The International Satisfiability Modulo Theories (SMT) Competition.
Competition results for the QF_AUFBV logic in the Single Query Track.
Page generated on 2023-07-06 16:04:54 +0000
Benchmarks: 38 | Time Limit: 1200 seconds | Memory Limit: 60 GB
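For reference, QF_AUFBV is the quantifier-free SMT-LIB logic of arrays, uninterpreted functions, and fixed-size bit-vectors, and each Single Query Track benchmark is an SMT-LIB script containing a single (check-sat) command. Below is a minimal, hypothetical sketch of a script in this logic (an illustration only, not one of the 38 competition benchmarks):

    ; Hypothetical QF_AUFBV example (illustration only, not a competition benchmark)
    (set-logic QF_AUFBV)
    (declare-fun f ((_ BitVec 32)) (_ BitVec 32))         ; uninterpreted function over 32-bit vectors
    (declare-const a (Array (_ BitVec 32) (_ BitVec 8)))  ; array indexed by 32-bit vectors, storing 8-bit values
    (declare-const i (_ BitVec 32))
    (declare-const j (_ BitVec 32))
    (assert (distinct i j))
    (assert (= (f i) (bvadd j #x00000001)))               ; bit-vector arithmetic combined with an uninterpreted function
    (assert (= (select (store a i #x01) j) #x02))         ; array read over a write at a different index
    (check-sat)                                           ; the single query of the track
    (exit)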
Winners:
Sequential Performance | Parallel Performance | SAT Performance (parallel) | UNSAT Performance (parallel) | 24s Performance (parallel) |
---|---|---|---|---|
Bitwuzla | Bitwuzla | Bitwuzla | Bitwuzla | Yices2 |
Sequential Performance
Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout |
---|---|---|---|---|---|---|---|---|---|---|
Bitwuzla Fixedⁿ | 0 | 33 | 2473.644 | 2474.409 | 33 | 17 | 16 | 5 | 5 | 0 |
Bitwuzla | 0 | 33 | 2474.619 | 2475.215 | 33 | 17 | 16 | 5 | 5 | 0 |
Yices2 | 0 | 27 | 2527.759 | 2528.181 | 27 | 13 | 14 | 11 | 11 | 0 |
Z3-Owl Fixedⁿ | 0 | 20 | 3428.861 | 3428.657 | 20 | 7 | 13 | 18 | 18 | 0 |
2022-Bitwuzlaⁿ | 0 | 19 | 411.792 | 411.828 | 19 | 10 | 9 | 19 | 4 | 0 |
cvc5 | 0 | 12 | 70.146 | 70.147 | 12 | 4 | 8 | 26 | 26 | 0 |
UltimateIntBlastingWrapper+SMTInterpol | 2 | 6 | 330.107 | 161.707 | 6 | 2 | 4 | 32 | 26 | 0 |
Z3-Owl | 6 | 25 | 1934.103 | 1934.628 | 25 | 14 | 11 | 13 | 7 | 0 |
Parallel Performance
Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout |
---|---|---|---|---|---|---|---|---|---|---|
Bitwuzla Fixedⁿ | 0 | 33 | 2473.644 | 2474.409 | 33 | 17 | 16 | 5 | 5 | 0 |
Bitwuzla | 0 | 33 | 2474.619 | 2475.215 | 33 | 17 | 16 | 5 | 5 | 0 |
Yices2 | 0 | 27 | 2527.759 | 2528.181 | 27 | 13 | 14 | 11 | 11 | 0 |
Z3-Owl Fixedⁿ | 0 | 20 | 3428.861 | 3428.657 | 20 | 7 | 13 | 18 | 18 | 0 |
2022-Bitwuzlaⁿ | 0 | 19 | 411.792 | 411.828 | 19 | 10 | 9 | 19 | 4 | 0 |
cvc5 | 0 | 12 | 70.146 | 70.147 | 12 | 4 | 8 | 26 | 26 | 0 |
UltimateIntBlastingWrapper+SMTInterpol | 2 | 6 | 330.107 | 161.707 | 6 | 2 | 4 | 32 | 25 | 0 |
Z3-Owl | 6 | 25 | 1934.103 | 1934.628 | 25 | 14 | 11 | 13 | 7 | 0 |
SAT Performance (parallel)
Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | N/A | Timeout | Memout |
---|---|---|---|---|---|---|---|---|---|---|---|
Bitwuzla | 0 | 17 | 1901.87 | 1902.393 | 17 | 17 | 0 | 0 | 21 | 5 | 0 |
Bitwuzla Fixedⁿ | 0 | 17 | 1907.658 | 1908.207 | 17 | 17 | 0 | 0 | 21 | 5 | 0 |
Yices2 | 0 | 13 | 2378.198 | 2378.61 | 13 | 13 | 0 | 4 | 21 | 11 | 0 |
2022-Bitwuzlaⁿ | 0 | 10 | 369.653 | 369.684 | 10 | 10 | 0 | 7 | 21 | 4 | 0 |
Z3-Owl Fixedⁿ | 0 | 7 | 2157.823 | 2158.373 | 7 | 7 | 0 | 10 | 21 | 18 | 0 |
cvc5 | 0 | 4 | 24.277 | 24.282 | 4 | 4 | 0 | 13 | 21 | 26 | 0 |
UltimateIntBlastingWrapper+SMTInterpol | 2 | 2 | 215.687 | 130.919 | 2 | 2 | 0 | 15 | 21 | 25 | 0 |
Z3-Owl | 6 | 14 | 1023.525 | 1023.716 | 14 | 14 | 0 | 3 | 21 | 7 | 0 |
UNSAT Performance (parallel)
Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | N/A | Timeout | Memout |
---|---|---|---|---|---|---|---|---|---|---|---|
Bitwuzla Fixedⁿ | 0 | 16 | 565.987 | 566.202 | 16 | 0 | 16 | 1 | 21 | 5 | 0 |
Bitwuzla | 0 | 16 | 572.749 | 572.822 | 16 | 0 | 16 | 1 | 21 | 5 | 0 |
Yices2 | 0 | 14 | 149.562 | 149.571 | 14 | 0 | 14 | 3 | 21 | 11 | 0 |
Z3-Owl Fixedⁿ | 0 | 13 | 1271.038 | 1270.284 | 13 | 0 | 13 | 4 | 21 | 18 | 0 |
Z3-Owl | 0 | 11 | 910.577 | 910.912 | 11 | 0 | 11 | 6 | 21 | 7 | 0 |
2022-Bitwuzlaⁿ | 0 | 9 | 42.139 | 42.144 | 9 | 0 | 9 | 8 | 21 | 4 | 0 |
cvc5 | 0 | 8 | 45.869 | 45.865 | 8 | 0 | 8 | 9 | 21 | 26 | 0 |
UltimateIntBlastingWrapper+SMTInterpol | 0 | 4 | 114.42 | 30.788 | 4 | 0 | 4 | 13 | 21 | 25 | 0 |
24s Performance (parallel)
Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout |
---|---|---|---|---|---|---|---|---|---|---|
Yices2 | 0 | 20 | 31.506 | 31.522 | 20 | 8 | 12 | 18 | 18 | 0 |
Bitwuzla | 0 | 16 | 60.621 | 60.628 | 16 | 7 | 9 | 22 | 22 | 0 |
Bitwuzla Fixedⁿ | 0 | 16 | 61.205 | 61.227 | 16 | 7 | 9 | 22 | 22 | 0 |
2022-Bitwuzlaⁿ | 0 | 15 | 18.948 | 18.949 | 15 | 7 | 8 | 23 | 8 | 0 |
cvc5 | 0 | 11 | 32.273 | 32.274 | 11 | 4 | 7 | 27 | 27 | 0 |
Z3-Owl Fixedⁿ | 0 | 8 | 52.145 | 51.107 | 8 | 2 | 6 | 30 | 30 | 0 |
UltimateIntBlastingWrapper+SMTInterpol | 0 | 5 | 167.179 | 49.375 | 5 | 1 | 4 | 33 | 31 | 0 |
Z3-Owl | 3 | 10 | 49.882 | 50.095 | 10 | 4 | 6 | 28 | 25 | 0 |
ⁿ Non-competing.
N/A: benchmarks not known to be SAT (in the SAT performance table) or not known to be UNSAT (in the UNSAT performance table), respectively.