The International Satisfiability Modulo Theories (SMT) Competition.
Competition results for the QF_AUFNIA logic in the Single Query Track.
Page generated on 2023-07-06 16:04:54 +0000
Benchmarks: 9
Time limit: 1200 seconds
Memory limit: 60 GB
Winners by division:
| Sequential Performance | Parallel Performance | SAT Performance (parallel) | UNSAT Performance (parallel) | 24s Performance (parallel) |
|---|---|---|---|---|
| SMTInterpol | SMTInterpol | SMTInterpol | SMTInterpol | SMTInterpol |
Sequential Performance
| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout |
|---|---|---|---|---|---|---|---|---|---|---|
| 2020-CVC4n | 0 | 9 | 20.607 | 20.605 | 9 | 2 | 7 | 0 | 0 | 0 |
| SMTInterpol | 0 | 9 | 48.726 | 16.76 | 9 | 2 | 7 | 0 | 0 | 0 |
| cvc5 | 0 | 9 | 49.831 | 49.872 | 9 | 2 | 7 | 0 | 0 | 0 |
| Yices2 | 0 | 9 | 282.853 | 235.868 | 9 | 2 | 7 | 0 | 0 | 0 |
Parallel Performance
| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout |
|---|---|---|---|---|---|---|---|---|---|---|
| SMTInterpol | 0 | 9 | 48.726 | 16.76 | 9 | 2 | 7 | 0 | 0 | 0 |
| 2020-CVC4n | 0 | 9 | 20.607 | 20.605 | 9 | 2 | 7 | 0 | 0 | 0 |
| cvc5 | 0 | 9 | 49.831 | 49.872 | 9 | 2 | 7 | 0 | 0 | 0 |
| Yices2 | 0 | 9 | 282.853 | 235.868 | 9 | 2 | 7 | 0 | 0 | 0 |
SAT Performance (parallel)
| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | N/A | Timeout | Memout |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 2020-CVC4n | 0 | 2 | 2.188 | 2.188 | 2 | 2 | 0 | 0 | 7 | 0 | 0 |
| SMTInterpol | 0 | 2 | 8.044 | 2.854 | 2 | 2 | 0 | 0 | 7 | 0 | 0 |
| cvc5 | 0 | 2 | 4.509 | 4.514 | 2 | 2 | 0 | 0 | 7 | 0 | 0 |
| Yices2 | 0 | 2 | 95.038 | 95.041 | 2 | 2 | 0 | 0 | 7 | 0 | 0 |
UNSAT Performance (parallel)
| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | N/A | Timeout | Memout |
|---|---|---|---|---|---|---|---|---|---|---|---|
| SMTInterpol | 0 | 7 | 40.682 | 13.906 | 7 | 0 | 7 | 0 | 2 | 0 | 0 |
| 2020-CVC4n | 0 | 7 | 18.419 | 18.417 | 7 | 0 | 7 | 0 | 2 | 0 | 0 |
| cvc5 | 0 | 7 | 45.322 | 45.358 | 7 | 0 | 7 | 0 | 2 | 0 | 0 |
| Yices2 | 0 | 7 | 187.815 | 140.827 | 7 | 0 | 7 | 0 | 2 | 0 | 0 |
24s Performance (parallel)
| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout |
|---|---|---|---|---|---|---|---|---|---|---|
| SMTInterpol | 0 | 9 | 48.726 | 16.76 | 9 | 2 | 7 | 0 | 0 | 0 |
| 2020-CVC4n | 0 | 9 | 20.607 | 20.605 | 9 | 2 | 7 | 0 | 0 | 0 |
| cvc5 | 0 | 9 | 49.831 | 49.872 | 9 | 2 | 7 | 0 | 0 | 0 |
| Yices2 | 0 | 4 | 1.577 | 1.577 | 4 | 0 | 4 | 5 | 5 | 0 |
n: Non-competing solver, shown for comparison only.
N/A: Benchmarks not known to be SAT/UNSAT, respectively.
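The ordering of the tables above can be reproduced with a short script. This is a sketch, not the competition's official scoring code: it assumes solvers are ranked by fewest errors, then most correct answers, then lowest wall-time score, and the field names and `Result` type are our own. The data below is copied from the Parallel Performance table.

```python
# Hypothetical ranking helper for the Single Query Track tables above.
# Assumed ordering: fewest errors, most correct answers, lowest wall time.
from typing import List, NamedTuple

class Result(NamedTuple):
    solver: str
    error_score: int
    correct_score: int
    wall_time_score: float

# Parallel Performance data for QF_AUFNIA, copied from the table above.
parallel: List[Result] = [
    Result("SMTInterpol", 0, 9, 16.76),
    Result("2020-CVC4n", 0, 9, 20.605),
    Result("cvc5", 0, 9, 49.872),
    Result("Yices2", 0, 9, 235.868),
]

def rank(results: List[Result]) -> List[Result]:
    # Errors ascending, correct answers descending, wall time ascending.
    return sorted(results,
                  key=lambda r: (r.error_score, -r.correct_score,
                                 r.wall_time_score))

print([r.solver for r in rank(parallel)])
# → ['SMTInterpol', '2020-CVC4n', 'cvc5', 'Yices2']
```

With every solver scoring 0 errors and 9 correct answers here, the wall-time tiebreak alone determines the order, which matches the Parallel Performance table.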