The International Satisfiability Modulo Theories (SMT) Competition.
Competition results for the ALIA logic in the Unsat Core Track.
Page generated on 2023-07-06 16:05:43 +0000
Benchmarks: 401
Time limit: 1200 seconds
Memory limit: 60 GB

Winners:

| Sequential Performance | Parallel Performance |
|---|---|
| SMTInterpol | SMTInterpol |
Sequential Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Timeout | Memout |
|---|---|---|---|---|---|---|
| 2021-cvc5-uc (n) | 0 | 1420 | 2230.158 | 2230.596 | 136 | 0 |
| SMTInterpol | 0 | 1213 | 13520.958 | 11511.555 | 135 | 0 |
| cvc5 | 0 | 754 | 7310.6 | 7311.846 | 134 | 0 |
| Vampire | 0 | 167 | 1054.934 | 276.424 | 6 | 0 |
| UltimateEliminator+MathSAT | 0 | 0 | 0.0 | 0.0 | 0 | 0 |
Parallel Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Timeout | Memout |
|---|---|---|---|---|---|---|
| 2021-cvc5-uc (n) | 0 | 1420 | 2230.158 | 2230.596 | 136 | 0 |
| SMTInterpol | 0 | 1221 | 14725.928 | 12464.654 | 132 | 0 |
| cvc5 | 0 | 754 | 7310.6 | 7311.846 | 134 | 0 |
| Vampire | 0 | 167 | 1054.934 | 276.424 | 2 | 0 |
| UltimateEliminator+MathSAT | 0 | 0 | 0.0 | 0.0 | 0 | 0 |
(n): Non-competing.
N/A: Benchmarks not known to be SAT/UNSAT, respectively.