The International Satisfiability Modulo Theories (SMT) Competition.
Competition results for the FPLRA logic in the Single Query Track of SMT-COMP 2023.
Benchmarks: 41 · Time Limit: 1200 seconds · Memory Limit: 60 GB
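As a point of reference for these limits, the sketch below shows one way to run a Single Query solver on a single SMT-LIB benchmark under a 1200-second CPU and 60 GB memory cap. It is illustrative only (the competition runs on its own infrastructure); the solver command and benchmark path are hypothetical placeholders.

```python
import resource
import subprocess

TIME_LIMIT_S = 1200            # per-benchmark time limit stated on this page
MEM_LIMIT_BYTES = 60 * 10**9   # 60 GB memory limit stated on this page

def set_limits():
    # Runs in the child process before exec (POSIX only):
    # cap CPU time and address space for the solver.
    resource.setrlimit(resource.RLIMIT_CPU, (TIME_LIMIT_S, TIME_LIMIT_S))
    resource.setrlimit(resource.RLIMIT_AS, (MEM_LIMIT_BYTES, MEM_LIMIT_BYTES))

def run_solver(solver_cmd, benchmark_path):
    """Run one solver on one SMT-LIB benchmark and return its final output line."""
    proc = subprocess.run(
        solver_cmd + [benchmark_path],
        stdout=subprocess.PIPE,
        stderr=subprocess.DEVNULL,
        preexec_fn=set_limits,
        text=True,
    )
    # A Single Query solver is expected to print "sat", "unsat", or "unknown".
    lines = proc.stdout.strip().splitlines()
    return lines[-1] if lines else "unknown"

# Hypothetical solver binary and benchmark file, for illustration only:
# print(run_solver(["./bitwuzla"], "benchmark.smt2"))
```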
Winners
| Sequential Performance | Parallel Performance | SAT Performance (parallel) | UNSAT Performance (parallel) | 24s Performance (parallel) |
|---|---|---|---|---|
| Bitwuzla | Bitwuzla | Bitwuzla | — | Bitwuzla |
Sequential Performance
| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout |
|---|---|---|---|---|---|---|---|---|---|---|
| Bitwuzla | 0 | 37 | 31.207 | 31.211 | 37 | 37 | 0 | 4 | 4 | 0 |
| Bitwuzla Fixedⁿ | 0 | 37 | 31.733 | 31.744 | 37 | 37 | 0 | 4 | 4 | 0 |
| 2022-Bitwuzlaⁿ | 0 | 27 | 810.086 | 810.146 | 27 | 27 | 0 | 14 | 14 | 0 |
| cvc5 | 0 | 23 | 140.884 | 140.879 | 23 | 23 | 0 | 18 | 17 | 0 |
| UltimateEliminator+MathSAT | 0 | 14 | 94.99 | 70.968 | 14 | 14 | 0 | 27 | 0 | 0 |
Parallel Performance
| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout |
|---|---|---|---|---|---|---|---|---|---|---|
| Bitwuzla | 0 | 37 | 31.207 | 31.211 | 37 | 37 | 0 | 4 | 4 | 0 |
| Bitwuzla Fixedⁿ | 0 | 37 | 31.733 | 31.744 | 37 | 37 | 0 | 4 | 4 | 0 |
| 2022-Bitwuzlaⁿ | 0 | 27 | 810.086 | 810.146 | 27 | 27 | 0 | 14 | 14 | 0 |
| cvc5 | 0 | 23 | 140.884 | 140.879 | 23 | 23 | 0 | 18 | 17 | 0 |
| UltimateEliminator+MathSAT | 0 | 14 | 94.99 | 70.968 | 14 | 14 | 0 | 27 | 0 | 0 |
SAT Performance (parallel)
| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | N/A | Timeout | Memout |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Bitwuzla | 0 | 37 | 31.207 | 31.211 | 37 | 37 | 0 | 0 | 4 | 4 | 0 |
| Bitwuzla Fixedⁿ | 0 | 37 | 31.733 | 31.744 | 37 | 37 | 0 | 0 | 4 | 4 | 0 |
| 2022-Bitwuzlaⁿ | 0 | 27 | 810.086 | 810.146 | 27 | 27 | 0 | 10 | 4 | 14 | 0 |
| cvc5 | 0 | 23 | 140.884 | 140.879 | 23 | 23 | 0 | 14 | 4 | 17 | 0 |
| UltimateEliminator+MathSAT | 0 | 14 | 94.99 | 70.968 | 14 | 14 | 0 | 23 | 4 | 0 | 0 |
UNSAT Performance (parallel)
| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | N/A | Timeout | Memout |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 2022-Bitwuzlaⁿ | 0 | 0 | 0.0 | 0.0 | 0 | 0 | 0 | 0 | 41 | 14 | 0 |
| cvc5 | 0 | 0 | 0.0 | 0.0 | 0 | 0 | 0 | 0 | 41 | 17 | 0 |
| Bitwuzla | 0 | 0 | 0.0 | 0.0 | 0 | 0 | 0 | 0 | 41 | 4 | 0 |
| UltimateEliminator+MathSAT | 0 | 0 | 0.0 | 0.0 | 0 | 0 | 0 | 0 | 41 | 0 | 0 |
| Bitwuzla Fixedⁿ | 0 | 0 | 0.0 | 0.0 | 0 | 0 | 0 | 0 | 41 | 4 | 0 |
24s Performance (parallel)
| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout |
|---|---|---|---|---|---|---|---|---|---|---|
| Bitwuzla | 0 | 37 | 31.207 | 31.211 | 37 | 37 | 0 | 4 | 4 | 0 |
| Bitwuzla Fixedⁿ | 0 | 37 | 31.733 | 31.744 | 37 | 37 | 0 | 4 | 4 | 0 |
| 2022-Bitwuzlaⁿ | 0 | 25 | 27.014 | 27.02 | 25 | 25 | 0 | 16 | 16 | 0 |
| cvc5 | 0 | 22 | 9.272 | 9.262 | 22 | 22 | 0 | 19 | 18 | 0 |
| UltimateEliminator+MathSAT | 0 | 14 | 94.99 | 70.968 | 14 | 14 | 0 | 27 | 0 | 0 |
ⁿ Non-competing.
N/A: Benchmarks not known to be SAT (in the SAT Performance table) or UNSAT (in the UNSAT Performance table), respectively.
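The per-solver columns in the tables above are aggregates over the 41 benchmarks. The sketch below tallies them from hypothetical per-benchmark records, assuming the straightforward reading of the columns (Correct Score counts correct definite answers, the time scores sum CPU/wall time over solved instances); the authoritative definitions are in the SMT-COMP scoring rules, and the Result record here is an assumption, not the competition's data format.

```python
from dataclasses import dataclass

@dataclass
class Result:
    answer: str        # solver output: "sat", "unsat", or "unknown"
    expected: str      # benchmark status: "sat", "unsat", or "unknown"
    cpu_time: float    # seconds
    wall_time: float   # seconds
    timeout: bool = False
    memout: bool = False

def aggregate(results):
    """Tally the table columns shown above for one solver over one benchmark set."""
    solved, errors = [], 0
    for r in results:
        if r.answer not in ("sat", "unsat"):
            continue  # "unknown", timeout, memout: unsolved, but not an error
        if r.expected in ("sat", "unsat") and r.answer != r.expected:
            errors += 1  # definite answer contradicting the known status
        else:
            solved.append(r)
    return {
        "Error Score": errors,
        "Correct Score": len(solved),
        "CPU Time Score": round(sum(r.cpu_time for r in solved), 3),
        "Wall Time Score": round(sum(r.wall_time for r in solved), 3),
        "Solved": len(solved),
        "Solved SAT": sum(r.answer == "sat" for r in solved),
        "Solved UNSAT": sum(r.answer == "unsat" for r in solved),
        "Unsolved": len(results) - len(solved),
        "Timeout": sum(r.timeout for r in results),
        "Memout": sum(r.memout for r in results),
    }
```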