The International Satisfiability Modulo Theories (SMT) Competition.
Competition results for the ABVFP logic in the Single Query Track.
Page generated on 2023-07-06 16:04:53 +0000
Benchmarks: 60 | Time Limit: 1200 seconds | Memory Limit: 60 GB
Category | Sequential Performance | Parallel Performance | SAT Performance (parallel) | UNSAT Performance (parallel) | 24s Performance (parallel) |
---|---|---|---|---|---|
Winner | cvc5 | cvc5 | Bitwuzla | cvc5 | Bitwuzla |
Sequential Performance

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout |
---|---|---|---|---|---|---|---|---|---|---|
2022-z3-4.8.17n | 0 | 43 | 9.392 | 9.31 | 43 | 40 | 3 | 17 | 8 | 0 |
cvc5 | 0 | 26 | 2964.142 | 2998.094 | 26 | 23 | 3 | 34 | 34 | 0 |
Bitwuzla Fixedn | 0 | 24 | 32.288 | 32.298 | 24 | 24 | 0 | 36 | 0 | 0 |
Bitwuzla | 0 | 24 | 32.663 | 32.675 | 24 | 24 | 0 | 36 | 0 | 0 |
UltimateEliminator+MathSAT | 0 | 0 | 0.0 | 0.0 | 0 | 0 | 0 | 60 | 0 | 0 |
Parallel Performance

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout |
---|---|---|---|---|---|---|---|---|---|---|
2022-z3-4.8.17n | 0 | 43 | 9.392 | 9.31 | 43 | 40 | 3 | 17 | 8 | 0 |
cvc5 | 0 | 26 | 2964.142 | 2998.094 | 26 | 23 | 3 | 34 | 34 | 0 |
Bitwuzla Fixedn | 0 | 24 | 32.288 | 32.298 | 24 | 24 | 0 | 36 | 0 | 0 |
Bitwuzla | 0 | 24 | 32.663 | 32.675 | 24 | 24 | 0 | 36 | 0 | 0 |
UltimateEliminator+MathSAT | 0 | 0 | 0.0 | 0.0 | 0 | 0 | 0 | 60 | 0 | 0 |
SAT Performance (parallel)

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | N/A | Timeout | Memout |
---|---|---|---|---|---|---|---|---|---|---|---|
2022-z3-4.8.17n | 0 | 40 | 9.267 | 9.191 | 40 | 40 | 0 | 5 | 15 | 8 | 0 |
Bitwuzla Fixedn | 0 | 24 | 32.288 | 32.298 | 24 | 24 | 0 | 21 | 15 | 0 | 0 |
Bitwuzla | 0 | 24 | 32.663 | 32.675 | 24 | 24 | 0 | 21 | 15 | 0 | 0 |
cvc5 | 0 | 23 | 2954.942 | 2988.89 | 23 | 23 | 0 | 22 | 15 | 34 | 0 |
UltimateEliminator+MathSAT | 0 | 0 | 0.0 | 0.0 | 0 | 0 | 0 | 45 | 15 | 0 | 0 |
UNSAT Performance (parallel)

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | N/A | Timeout | Memout |
---|---|---|---|---|---|---|---|---|---|---|---|
2022-z3-4.8.17n | 0 | 3 | 0.125 | 0.12 | 3 | 0 | 3 | 0 | 57 | 8 | 0 |
cvc5 | 0 | 3 | 9.2 | 9.204 | 3 | 0 | 3 | 0 | 57 | 34 | 0 |
Bitwuzla | 0 | 0 | 0.0 | 0.0 | 0 | 0 | 0 | 3 | 57 | 0 | 0 |
UltimateEliminator+MathSAT | 0 | 0 | 0.0 | 0.0 | 0 | 0 | 0 | 3 | 57 | 0 | 0 |
Bitwuzla Fixedn | 0 | 0 | 0.0 | 0.0 | 0 | 0 | 0 | 3 | 57 | 0 | 0 |
24s Performance (parallel)

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout |
---|---|---|---|---|---|---|---|---|---|---|
2022-z3-4.8.17n | 0 | 43 | 9.392 | 9.31 | 43 | 40 | 3 | 17 | 8 | 0 |
Bitwuzla | 0 | 23 | 1.328 | 1.329 | 23 | 23 | 0 | 37 | 1 | 0 |
Bitwuzla Fixedn | 0 | 23 | 1.333 | 1.335 | 23 | 23 | 0 | 37 | 1 | 0 |
cvc5 | 0 | 19 | 13.187 | 13.188 | 19 | 16 | 3 | 41 | 41 | 0 |
UltimateEliminator+MathSAT | 0 | 0 | 0.0 | 0.0 | 0 | 0 | 0 | 60 | 0 | 0 |
n: Non-competing solver.
N/A: Benchmarks not known to be SAT or UNSAT, respectively; these are excluded from the SAT and UNSAT performance tables.