The International Satisfiability Modulo Theories (SMT) Competition.
Competition results for the ABVFPLRA logic in the Single Query Track.
Benchmarks: 77
Time Limit: 1200 seconds
Memory Limit: 60 GB
Winners (non-competing entries, marked with a trailing n, are not eligible to win):

Sequential Performance | Parallel Performance | SAT Performance (parallel) | UNSAT Performance (parallel) | 24s Performance (parallel) |
---|---|---|---|---|
cvc5 | cvc5 | cvc5 | cvc5 | Bitwuzla |
Sequential Performance

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout |
---|---|---|---|---|---|---|---|---|---|---|
2022-z3-4.8.17n | 0 | 49 | 245.538 | 245.459 | 49 | 47 | 2 | 28 | 26 | 0 |
cvc5 | 0 | 38 | 4279.349 | 4294.91 | 38 | 34 | 4 | 39 | 38 | 0 |
Bitwuzla | 0 | 33 | 44.462 | 44.466 | 33 | 32 | 1 | 44 | 0 | 0 |
Bitwuzla Fixedn | 0 | 33 | 44.538 | 44.546 | 33 | 32 | 1 | 44 | 0 | 0 |
UltimateEliminator+MathSAT | 0 | 0 | 0.0 | 0.0 | 0 | 0 | 0 | 77 | 0 | 0 |
Parallel Performance

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout |
---|---|---|---|---|---|---|---|---|---|---|
2022-z3-4.8.17n | 0 | 49 | 245.538 | 245.459 | 49 | 47 | 2 | 28 | 26 | 0 |
cvc5 | 0 | 38 | 4279.349 | 4294.91 | 38 | 34 | 4 | 39 | 38 | 0 |
Bitwuzla | 0 | 33 | 44.462 | 44.466 | 33 | 32 | 1 | 44 | 0 | 0 |
Bitwuzla Fixedn | 0 | 33 | 44.538 | 44.546 | 33 | 32 | 1 | 44 | 0 | 0 |
UltimateEliminator+MathSAT | 0 | 0 | 0.0 | 0.0 | 0 | 0 | 0 | 77 | 0 | 0 |
SAT Performance (parallel)

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | N/A | Timeout | Memout |
---|---|---|---|---|---|---|---|---|---|---|---|
2022-z3-4.8.17n | 0 | 47 | 2.928 | 2.838 | 47 | 47 | 0 | 0 | 30 | 26 | 0 |
cvc5 | 0 | 34 | 4268.686 | 4284.236 | 34 | 34 | 0 | 13 | 30 | 38 | 0 |
Bitwuzla | 0 | 32 | 44.388 | 44.392 | 32 | 32 | 0 | 15 | 30 | 0 | 0 |
Bitwuzla Fixedn | 0 | 32 | 44.464 | 44.473 | 32 | 32 | 0 | 15 | 30 | 0 | 0 |
UltimateEliminator+MathSAT | 0 | 0 | 0.0 | 0.0 | 0 | 0 | 0 | 47 | 30 | 0 | 0 |
UNSAT Performance (parallel)

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | N/A | Timeout | Memout |
---|---|---|---|---|---|---|---|---|---|---|---|
cvc5 | 0 | 4 | 10.663 | 10.674 | 4 | 0 | 4 | 0 | 73 | 38 | 0 |
2022-z3-4.8.17n | 0 | 2 | 242.611 | 242.621 | 2 | 0 | 2 | 2 | 73 | 26 | 0 |
Bitwuzla Fixedn | 0 | 1 | 0.073 | 0.074 | 1 | 0 | 1 | 3 | 73 | 0 | 0 |
Bitwuzla | 0 | 1 | 0.074 | 0.074 | 1 | 0 | 1 | 3 | 73 | 0 | 0 |
UltimateEliminator+MathSAT | 0 | 0 | 0.0 | 0.0 | 0 | 0 | 0 | 4 | 73 | 0 | 0 |
24s Performance (parallel)

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout |
---|---|---|---|---|---|---|---|---|---|---|
2022-z3-4.8.17n | 0 | 47 | 2.928 | 2.838 | 47 | 47 | 0 | 30 | 28 | 0 |
Bitwuzla | 0 | 33 | 44.462 | 44.466 | 33 | 32 | 1 | 44 | 0 | 0 |
Bitwuzla Fixedn | 0 | 33 | 44.538 | 44.546 | 33 | 32 | 1 | 44 | 0 | 0 |
cvc5 | 0 | 24 | 12.713 | 12.714 | 24 | 20 | 4 | 53 | 53 | 0 |
UltimateEliminator+MathSAT | 0 | 0 | 0.0 | 0.0 | 0 | 0 | 0 | 77 | 0 | 0 |
n: Non-competing (shown as a trailing n on the solver name).
N/A: Benchmarks not known to be SAT/UNSAT, respectively.