The International Satisfiability Modulo Theories (SMT) Competition.
Competition results for the UFLRA division in the Single Query Track.
Page generated on 2020-07-04 11:46:59 +0000
Benchmarks: 7 · Time Limit: 1200 seconds · Memory Limit: 60 GB
Winners

| Sequential Performance | Parallel Performance | SAT Performance (parallel) | UNSAT Performance (parallel) | 24s Performance (parallel) |
|---|---|---|---|---|
| Alt-Ergo | Alt-Ergo | — | veriT | Alt-Ergo |
Sequential Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Abstained | Timeout | Memout |
|---|---|---|---|---|---|---|---|---|---|---|---|
| z3n | 0 | 5 | 2769.397 | 2769.486 | 5 | 3 | 2 | 2 | 2 | 0 | |
| 2018-Z3n | 0 | 5 | 3367.137 | 3367.718 | 5 | 3 | 2 | 2 | 2 | 0 | |
| veriT+viten | 0 | 2 | 2478.152 | 2478.414 | 2 | 0 | 2 | 5 | 2 | 0 | |
| Alt-Ergo | 0 | 2 | 3593.873 | 3056.142 | 2 | 0 | 2 | 5 | 2 | 0 | |
| SMTInterpol-fixedn | 0 | 2 | 4801.761 | 4801.158 | 2 | 0 | 2 | 5 | 4 | 0 | |
| SMTInterpol | 0 | 2 | 4801.778 | 4801.194 | 2 | 0 | 2 | 5 | 4 | 0 | |
| CVC4 | 0 | 2 | 5996.614 | 6000.434 | 2 | 0 | 2 | 5 | 5 | 0 | |
| veriT | 0 | 2 | 5998.615 | 6000.259 | 2 | 0 | 2 | 5 | 5 | 0 | |
| Vampire | 0 | 2 | 6005.8 | 6002.504 | 2 | 0 | 2 | 5 | 5 | 0 | |
| UltimateEliminator+MathSAT | 0 | 0 | 23.435 | 16.44 | 0 | 0 | 0 | 7 | 0 | 0 | |
Parallel Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Abstained | Timeout | Memout |
|---|---|---|---|---|---|---|---|---|---|---|---|
| z3n | 0 | 5 | 2769.397 | 2769.486 | 5 | 3 | 2 | 2 | 2 | 0 | |
| 2018-Z3n | 0 | 5 | 3367.327 | 3367.648 | 5 | 3 | 2 | 2 | 2 | 0 | |
| veriT+viten | 0 | 2 | 2478.312 | 2478.324 | 2 | 0 | 2 | 5 | 2 | 0 | |
| Alt-Ergo | 0 | 2 | 3593.873 | 3056.142 | 2 | 0 | 2 | 5 | 2 | 0 | |
| SMTInterpol-fixedn | 0 | 2 | 4801.761 | 4801.158 | 2 | 0 | 2 | 5 | 4 | 0 | |
| SMTInterpol | 0 | 2 | 4801.778 | 4801.194 | 2 | 0 | 2 | 5 | 4 | 0 | |
| veriT | 0 | 2 | 6000.025 | 6000.029 | 2 | 0 | 2 | 5 | 5 | 0 | |
| CVC4 | 0 | 2 | 6000.034 | 6000.034 | 2 | 0 | 2 | 5 | 5 | 0 | |
| Vampire | 0 | 2 | 6005.8 | 6002.504 | 2 | 0 | 2 | 5 | 5 | 0 | |
| UltimateEliminator+MathSAT | 0 | 0 | 23.435 | 16.44 | 0 | 0 | 0 | 7 | 0 | 0 | |
SAT Performance (parallel)

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout |
|---|---|---|---|---|---|---|---|---|---|---|
| z3n | 0 | 3 | 2769.332 | 2769.422 | 3 | 3 | 0 | 4 | 2 | 0 |
| 2018-Z3n | 0 | 3 | 3367.254 | 3367.575 | 3 | 3 | 0 | 4 | 2 | 0 |
| UltimateEliminator+MathSAT | 0 | 0 | 16.753 | 11.765 | 0 | 0 | 0 | 7 | 0 | 0 |
| veriT+viten | 0 | 0 | 2478.292 | 2478.294 | 0 | 0 | 0 | 7 | 2 | 0 |
| Alt-Ergo | 0 | 0 | 3593.821 | 3056.089 | 0 | 0 | 0 | 7 | 2 | 0 |
| SMTInterpol-fixedn | 0 | 0 | 4800.636 | 4800.408 | 0 | 0 | 0 | 7 | 4 | 0 |
| SMTInterpol | 0 | 0 | 4800.64 | 4800.434 | 0 | 0 | 0 | 7 | 4 | 0 |
| veriT | 0 | 0 | 6000.0 | 6000.0 | 0 | 0 | 0 | 7 | 5 | 0 |
| Vampire | 0 | 0 | 6000.0 | 6000.0 | 0 | 0 | 0 | 7 | 5 | 0 |
| CVC4 | 0 | 0 | 6000.0 | 6000.0 | 0 | 0 | 0 | 7 | 5 | 0 |
UNSAT Performance (parallel)

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout |
|---|---|---|---|---|---|---|---|---|---|---|
| veriT | 0 | 2 | 0.025 | 0.029 | 2 | 0 | 2 | 5 | 5 | 0 |
| veriT+viten | 0 | 2 | 0.02 | 0.03 | 2 | 0 | 2 | 5 | 2 | 0 |
| CVC4 | 0 | 2 | 0.034 | 0.034 | 2 | 0 | 2 | 5 | 5 | 0 |
| Alt-Ergo | 0 | 2 | 0.053 | 0.053 | 2 | 0 | 2 | 5 | 2 | 0 |
| z3n | 0 | 2 | 0.064 | 0.064 | 2 | 0 | 2 | 5 | 2 | 0 |
| 2018-Z3n | 0 | 2 | 0.073 | 0.073 | 2 | 0 | 2 | 5 | 2 | 0 |
| SMTInterpol-fixedn | 0 | 2 | 1.125 | 0.749 | 2 | 0 | 2 | 5 | 4 | 0 |
| SMTInterpol | 0 | 2 | 1.139 | 0.76 | 2 | 0 | 2 | 5 | 4 | 0 |
| Vampire | 0 | 2 | 5.8 | 2.504 | 2 | 0 | 2 | 5 | 5 | 0 |
| UltimateEliminator+MathSAT | 0 | 0 | 6.682 | 4.674 | 0 | 0 | 0 | 7 | 0 | 0 |
24s Performance (parallel)

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Abstained | Timeout | Memout |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 2018-Z3n | 0 | 4 | 95.263 | 95.263 | 4 | 2 | 2 | 3 | 3 | 0 | |
| z3n | 0 | 3 | 98.109 | 98.109 | 3 | 1 | 2 | 4 | 4 | 0 | |
| veriT+viten | 0 | 2 | 73.499 | 73.509 | 2 | 0 | 2 | 5 | 3 | 0 | |
| Alt-Ergo | 0 | 2 | 96.19 | 96.104 | 2 | 0 | 2 | 5 | 4 | 0 | |
| SMTInterpol-fixedn | 0 | 2 | 97.761 | 97.158 | 2 | 0 | 2 | 5 | 4 | 0 | |
| SMTInterpol | 0 | 2 | 97.778 | 97.194 | 2 | 0 | 2 | 5 | 4 | 0 | |
| veriT | 0 | 2 | 120.025 | 120.029 | 2 | 0 | 2 | 5 | 5 | 0 | |
| CVC4 | 0 | 2 | 120.034 | 120.034 | 2 | 0 | 2 | 5 | 5 | 0 | |
| Vampire | 0 | 2 | 125.8 | 122.504 | 2 | 0 | 2 | 5 | 5 | 0 | |
| UltimateEliminator+MathSAT | 0 | 0 | 23.435 | 16.44 | 0 | 0 | 0 | 7 | 0 | 0 | |
n: Non-competing (the trailing n appended to a solver's name marks an entry that ran outside the competition ranking).
Abstained: total number of benchmarks in this division's logics that the solver chose not to attempt. For the SAT/UNSAT scores, this column also counts benchmarks whose status is not known to be SAT/UNSAT.
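The rows in the tables above appear to be ordered lexicographically: by error score (ascending), then correct score (descending), then time score (ascending). A minimal sketch of that ordering, under the assumption that this is the tie-breaking rule used, applied to a few rows of the Sequential Performance table:

```python
# Assumed ranking rule: fewer errors first, then more correct answers,
# then lower CPU time score. This reproduces the apparent table order;
# it is a sketch, not the official SMT-COMP scoring implementation.

rows = [
    # (solver, error_score, correct_score, cpu_time_score)
    ("veriT+viten", 0, 2, 2478.152),
    ("z3n", 0, 5, 2769.397),
    ("UltimateEliminator+MathSAT", 0, 0, 23.435),
    ("2018-Z3n", 0, 5, 3367.137),
]

# Sort key: error ascending, correct descending (negated), time ascending.
ranked = sorted(rows, key=lambda r: (r[1], -r[2], r[3]))
print([r[0] for r in ranked])
# → ['z3n', '2018-Z3n', 'veriT+viten', 'UltimateEliminator+MathSAT']
```

This matches the order in which these four solvers appear in the Sequential Performance table.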