The International Satisfiability Modulo Theories (SMT) Competition.
Competition results for the QF_AUFNIA logic in the Single Query Track.
Benchmarks: 9 · Time Limit: 1200 seconds · Memory Limit: 60 GB
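For reference, the limits above can be approximated when rerunning a single benchmark locally. The sketch below is a minimal Python example, assuming a `cvc5` binary on PATH and a hypothetical local file `benchmark.smt2`; it enforces only the 1200-second budget and is not the competition's actual harness, which also applies the 60 GB memory cap and precise CPU-time accounting.

```python
import subprocess

# Minimal sketch: run one SMT-LIB benchmark under the track's 1200-second
# time limit. Assumes `cvc5` is installed and on PATH, and that
# `benchmark.smt2` exists locally (both are illustrative assumptions).
try:
    result = subprocess.run(
        ["cvc5", "benchmark.smt2"],
        capture_output=True,
        text=True,
        timeout=1200,  # seconds, matching the track's time limit
    )
    # A single-query run is expected to print "sat", "unsat", or "unknown";
    # "sat"/"unsat" answers are what count toward Solved SAT / Solved UNSAT.
    print(result.stdout.strip())
except subprocess.TimeoutExpired:
    print("timeout")
```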
Winners

Sequential Performance | Parallel Performance | SAT Performance (parallel) | UNSAT Performance (parallel) | 24s Performance (parallel) |
---|---|---|---|---|
smtinterpol | smtinterpol | smtinterpol | smtinterpol | smtinterpol |
Sequential Performance

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout |
---|---|---|---|---|---|---|---|---|---|---|
MathSATn | 0 | 9 | 7.515 | 7.518 | 9 | 2 | 7 | 0 | 0 | 0 |
2020-CVC4n | 0 | 9 | 22.163 | 22.162 | 9 | 2 | 7 | 0 | 0 | 0 |
smtinterpol | 0 | 9 | 50.46 | 17.24 | 9 | 2 | 7 | 0 | 0 | 0 |
cvc5 | 0 | 9 | 155.507 | 151.785 | 9 | 2 | 7 | 0 | 0 | 0 |
z3-4.8.17n | 0 | 8 | 1203.383 | 1203.511 | 8 | 1 | 7 | 1 | 1 | 0 |
Parallel Performance

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout |
---|---|---|---|---|---|---|---|---|---|---|
MathSATn | 0 | 9 | 7.515 | 7.518 | 9 | 2 | 7 | 0 | 0 | 0 |
smtinterpol | 0 | 9 | 50.46 | 17.24 | 9 | 2 | 7 | 0 | 0 | 0 |
2020-CVC4n | 0 | 9 | 22.163 | 22.162 | 9 | 2 | 7 | 0 | 0 | 0 |
cvc5 | 0 | 9 | 155.507 | 151.785 | 9 | 2 | 7 | 0 | 0 | 0 |
z3-4.8.17n | 0 | 8 | 1203.493 | 1203.481 | 8 | 1 | 7 | 1 | 1 | 0 |
SAT Performance (parallel)

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | N/A | Timeout | Memout |
---|---|---|---|---|---|---|---|---|---|---|---|
MathSATn | 0 | 2 | 0.476 | 0.476 | 2 | 2 | 0 | 0 | 7 | 0 | 0 |
smtinterpol | 0 | 2 | 8.308 | 2.92 | 2 | 2 | 0 | 0 | 7 | 0 | 0 |
2020-CVC4n | 0 | 2 | 3.081 | 3.08 | 2 | 2 | 0 | 0 | 7 | 0 | 0 |
cvc5 | 0 | 2 | 16.024 | 16.027 | 2 | 2 | 0 | 0 | 7 | 0 | 0 |
z3-4.8.17n | 0 | 1 | 1202.368 | 1202.367 | 1 | 1 | 0 | 1 | 7 | 1 | 0 |
UNSAT Performance (parallel)

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | N/A | Timeout | Memout |
---|---|---|---|---|---|---|---|---|---|---|---|
z3-4.8.17n | 0 | 7 | 1.125 | 1.114 | 7 | 0 | 7 | 0 | 2 | 1 | 0 |
MathSATn | 0 | 7 | 7.039 | 7.042 | 7 | 0 | 7 | 0 | 2 | 0 | 0 |
smtinterpol | 0 | 7 | 42.153 | 14.319 | 7 | 0 | 7 | 0 | 2 | 0 | 0 |
2020-CVC4n | 0 | 7 | 19.083 | 19.082 | 7 | 0 | 7 | 0 | 2 | 0 | 0 |
cvc5 | 0 | 7 | 139.483 | 135.758 | 7 | 0 | 7 | 0 | 2 | 0 | 0 |
24s Performance (parallel)

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout |
---|---|---|---|---|---|---|---|---|---|---|
MathSATn | 0 | 9 | 7.515 | 7.518 | 9 | 2 | 7 | 0 | 0 | 0 |
smtinterpol | 0 | 9 | 50.46 | 17.24 | 9 | 2 | 7 | 0 | 0 | 0 |
2020-CVC4n | 0 | 9 | 22.163 | 22.162 | 9 | 2 | 7 | 0 | 0 | 0 |
z3-4.8.17n | 0 | 8 | 27.493 | 27.481 | 8 | 1 | 7 | 1 | 1 | 0 |
cvc5 | 0 | 7 | 109.121 | 105.366 | 7 | 2 | 5 | 2 | 2 | 0 |
n: non-competing solver.
N/A: benchmarks not known to be SAT or UNSAT, respectively (SAT and UNSAT performance tables only).