The International Satisfiability Modulo Theories (SMT) Competition.
Competition results for the FPLRA logic in the Single Query Track.
Page generated on 2022-08-10 11:17:44 +0000
Benchmarks: 87 | Time Limit: 1200 seconds | Memory Limit: 60 GB
Winners
Sequential Performance | Parallel Performance | SAT Performance (parallel) | UNSAT Performance (parallel) | 24s Performance (parallel) |
---|---|---|---|---|
Bitwuzla | Bitwuzla | Bitwuzla | Bitwuzla | Bitwuzla |
Sequential Performance
Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout |
---|---|---|---|---|---|---|---|---|---|---|
z3-4.8.17n | 0 | 70 | 22636.463 | 22612.763 | 70 | 55 | 15 | 17 | 17 | 0 |
Bitwuzla | 0 | 58 | 37766.316 | 37771.674 | 58 | 43 | 15 | 29 | 29 | 0 |
cvc5 | 0 | 50 | 42180.236 | 42176.345 | 50 | 35 | 15 | 37 | 35 | 0 |
2021-cvc5n | 0 | 50 | 42246.502 | 42252.259 | 50 | 35 | 15 | 37 | 35 | 0 |
UltimateEliminator+MathSAT | 0 | 36 | 560.237 | 402.55 | 36 | 23 | 13 | 51 | 0 | 0 |
Parallel Performance
Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout |
---|---|---|---|---|---|---|---|---|---|---|
z3-4.8.17n | 0 | 70 | 22640.473 | 22611.983 | 70 | 55 | 15 | 17 | 17 | 0 |
Bitwuzla | 0 | 58 | 37770.246 | 37770.714 | 58 | 43 | 15 | 29 | 29 | 0 |
cvc5 | 0 | 50 | 42183.816 | 42175.155 | 50 | 35 | 15 | 37 | 35 | 0 |
2021-cvc5n | 0 | 50 | 42250.532 | 42250.549 | 50 | 35 | 15 | 37 | 35 | 0 |
UltimateEliminator+MathSAT | 0 | 36 | 560.237 | 402.55 | 36 | 23 | 13 | 51 | 0 | 0 |
SAT Performance (parallel)
Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | N/A | Timeout | Memout |
---|---|---|---|---|---|---|---|---|---|---|---|
z3-4.8.17n | 0 | 55 | 13050.175 | 13021.481 | 55 | 55 | 0 | 10 | 22 | 17 | 0 |
Bitwuzla | 0 | 43 | 29368.602 | 29369.066 | 43 | 43 | 0 | 22 | 22 | 29 | 0 |
cvc5 | 0 | 35 | 33757.341 | 33748.662 | 35 | 35 | 0 | 30 | 22 | 35 | 0 |
2021-cvc5n | 0 | 35 | 33824.49 | 33824.501 | 35 | 35 | 0 | 30 | 22 | 35 | 0 |
UltimateEliminator+MathSAT | 0 | 23 | 422.995 | 307.409 | 23 | 23 | 0 | 42 | 22 | 0 | 0 |
UNSAT Performance (parallel)
Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | N/A | Timeout | Memout |
---|---|---|---|---|---|---|---|---|---|---|---|
Bitwuzla | 0 | 15 | 1.644 | 1.648 | 15 | 0 | 15 | 0 | 72 | 29 | 0 |
2021-cvc5n | 0 | 15 | 26.042 | 26.048 | 15 | 0 | 15 | 0 | 72 | 35 | 0 |
cvc5 | 0 | 15 | 26.475 | 26.493 | 15 | 0 | 15 | 0 | 72 | 35 | 0 |
z3-4.8.17n | 0 | 15 | 1190.298 | 1190.502 | 15 | 0 | 15 | 0 | 72 | 17 | 0 |
UltimateEliminator+MathSAT | 0 | 13 | 88.203 | 58.715 | 13 | 0 | 13 | 2 | 72 | 0 | 0 |
24s Performance (parallel)
Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout |
---|---|---|---|---|---|---|---|---|---|---|
Bitwuzla | 0 | 53 | 841.678 | 841.685 | 53 | 38 | 15 | 34 | 34 | 0 |
z3-4.8.17n | 0 | 50 | 1154.321 | 1154.252 | 50 | 45 | 5 | 37 | 37 | 0 |
cvc5 | 0 | 49 | 946.357 | 937.688 | 49 | 34 | 15 | 38 | 36 | 0 |
2021-cvc5n | 0 | 49 | 952.956 | 952.96 | 49 | 34 | 15 | 38 | 36 | 0 |
UltimateEliminator+MathSAT | 0 | 36 | 560.237 | 402.55 | 36 | 23 | 13 | 51 | 0 | 0 |
n: Non-competing.
N/A: Benchmarks not known to be SAT/UNSAT, respectively.