The International Satisfiability Modulo Theories (SMT) Competition.
Competition results for the QF_ANIA logic in the Single Query Track.
Page generated on 2022-08-10 11:17:44 +0000
Benchmarks: 102
Time Limit: 1200 seconds
Memory Limit: 60 GB
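For readers who want to rerun a single benchmark locally under the same budget, the following is a minimal sketch that enforces a 1200-second CPU limit and a 60 GB memory limit around one solver invocation. The solver binary (`./solver`) and benchmark file (`benchmark.smt2`) are placeholder names, and this does not reproduce the competition's own execution infrastructure.

```python
# Minimal sketch: run an SMT solver on one benchmark under the limits listed
# above (1200 s CPU time, 60 GB memory).  "./solver" and "benchmark.smt2" are
# placeholders, not names taken from the competition setup.
import resource
import subprocess

TIME_LIMIT_S = 1200
MEM_LIMIT_BYTES = 60 * 10**9  # 60 GB

def set_limits():
    # Applied in the child process just before the solver starts (POSIX only).
    resource.setrlimit(resource.RLIMIT_CPU, (TIME_LIMIT_S, TIME_LIMIT_S))
    resource.setrlimit(resource.RLIMIT_AS, (MEM_LIMIT_BYTES, MEM_LIMIT_BYTES))

proc = subprocess.run(
    ["./solver", "benchmark.smt2"],
    preexec_fn=set_limits,
    capture_output=True,
    text=True,
    timeout=TIME_LIMIT_S + 60,  # wall-clock safety margin
)
print(proc.stdout.strip())      # expected output: "sat", "unsat", or "unknown"
```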
| Sequential Performance | Parallel Performance | SAT Performance (parallel) | UNSAT Performance (parallel) | 24s Performance (parallel) |
|---|---|---|---|---|
| smtinterpol | smtinterpol | smtinterpol | smtinterpol | smtinterpol |
Sequential Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout |
|---|---|---|---|---|---|---|---|---|---|---|
| smtinterpol | 0 | 88 | 13512.89 | 13079.368 | 88 | 73 | 15 | 14 | 10 | 0 |
| MathSATn | 0 | 80 | 20903.711 | 20905.804 | 80 | 70 | 10 | 22 | 15 | 0 |
| 2020-CVC4n | 0 | 76 | 34151.047 | 34156.684 | 76 | 64 | 12 | 26 | 26 | 0 |
| cvc5 | 0 | 66 | 45994.385 | 45985.108 | 66 | 55 | 11 | 36 | 36 | 0 |
| z3-4.8.17n | 0 | 46 | 69812.397 | 69826.927 | 46 | 32 | 14 | 56 | 56 | 0 |
Parallel Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout |
|---|---|---|---|---|---|---|---|---|---|---|
| smtinterpol | 0 | 88 | 13512.89 | 13079.368 | 88 | 73 | 15 | 14 | 10 | 0 |
| MathSATn | 0 | 80 | 20904.941 | 20905.134 | 80 | 70 | 10 | 22 | 15 | 0 |
| 2020-CVC4n | 0 | 76 | 34155.547 | 34155.964 | 76 | 64 | 12 | 26 | 26 | 0 |
| cvc5 | 0 | 66 | 46001.845 | 45983.558 | 66 | 55 | 11 | 36 | 36 | 0 |
| z3-4.8.17n | 0 | 46 | 69824.247 | 69824.597 | 46 | 32 | 14 | 56 | 56 | 0 |
SAT Performance (parallel)

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | N/A | Timeout | Memout |
|---|---|---|---|---|---|---|---|---|---|---|---|
| smtinterpol | 0 | 73 | 702.269 | 475.712 | 73 | 73 | 0 | 3 | 26 | 10 | 0 |
| MathSATn | 0 | 70 | 376.589 | 376.626 | 70 | 70 | 0 | 6 | 26 | 15 | 0 |
| 2020-CVC4n | 0 | 64 | 16564.46 | 16564.731 | 64 | 64 | 0 | 12 | 26 | 26 | 0 |
| cvc5 | 0 | 55 | 27379.314 | 27360.971 | 55 | 55 | 0 | 21 | 26 | 36 | 0 |
| z3-4.8.17n | 0 | 32 | 55220.369 | 55220.703 | 32 | 32 | 0 | 44 | 26 | 56 | 0 |
UNSAT Performance (parallel)

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | N/A | Timeout | Memout |
|---|---|---|---|---|---|---|---|---|---|---|---|
| smtinterpol | 0 | 15 | 12810.621 | 12603.656 | 15 | 0 | 15 | 11 | 76 | 10 | 0 |
| z3-4.8.17n | 0 | 14 | 14603.878 | 14603.894 | 14 | 0 | 14 | 12 | 76 | 56 | 0 |
| 2020-CVC4n | 0 | 12 | 17591.087 | 17591.234 | 12 | 0 | 12 | 14 | 76 | 26 | 0 |
| cvc5 | 0 | 11 | 18622.531 | 18622.587 | 11 | 0 | 11 | 15 | 76 | 36 | 0 |
| MathSATn | 0 | 10 | 20528.353 | 20528.507 | 10 | 0 | 10 | 16 | 76 | 15 | 0 |
24s Performance (parallel)

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout |
|---|---|---|---|---|---|---|---|---|---|---|
| smtinterpol | 0 | 82 | 811.012 | 568.832 | 82 | 72 | 10 | 20 | 16 | 0 |
| MathSATn | 0 | 73 | 634.74 | 634.753 | 73 | 67 | 6 | 29 | 22 | 0 |
| 2020-CVC4n | 0 | 61 | 1112.988 | 1112.982 | 61 | 55 | 6 | 41 | 41 | 0 |
| cvc5 | 0 | 50 | 1408.733 | 1390.147 | 50 | 45 | 5 | 52 | 52 | 0 |
| z3-4.8.17n | 0 | 33 | 1766.118 | 1766.068 | 33 | 21 | 12 | 69 | 69 | 0 |
n (suffix on a solver name): non-competing solver.
N/A: benchmarks not known to be SAT (in the SAT performance table) or UNSAT (in the UNSAT performance table); these are not counted as solved or unsolved there.
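As a rough illustration of how the orderings above come about, the sketch below recomputes the parallel-performance order from the table rows, sorting by error score, then correct score, then wall-time score. This is a simplified reading of the single-query ranking criteria, not the competition's official scoring tool, and it ignores the special handling of non-competing solvers (marked n); the row values are copied from the Parallel Performance table above.

```python
# Minimal sketch: re-rank the Parallel Performance rows by
# (error score ascending, correct score descending, wall-time score ascending).
from dataclasses import dataclass

@dataclass
class Result:
    solver: str
    error_score: int
    correct_score: int
    wall_time_score: float

# Rows copied from the Parallel Performance table above.
rows = [
    Result("smtinterpol", 0, 88, 13079.368),
    Result("MathSATn",    0, 80, 20905.134),
    Result("2020-CVC4n",  0, 76, 34155.964),
    Result("cvc5",        0, 66, 45983.558),
    Result("z3-4.8.17n",  0, 46, 69824.597),
]

def rank_key(r: Result):
    # Fewer errors is better, more correct answers is better,
    # lower wall-time score breaks ties.
    return (r.error_score, -r.correct_score, r.wall_time_score)

for place, r in enumerate(sorted(rows, key=rank_key), start=1):
    print(f"{place}. {r.solver}: {r.correct_score} correct, "
          f"{r.wall_time_score:.3f} s wall time")
```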