The International Satisfiability Modulo Theories (SMT) Competition.
Competition results for the UFLRA logic in the Unsat Core Track.
Page generated on 2021-07-18 17:31:25 +0000
Benchmarks: 10 · Time Limit: 1200 seconds · Memory Limit: 60 GB
Winners:

| Sequential Performance | Parallel Performance |
|---|---|
| cvc5-uc | cvc5-uc |

Sequential Performance
| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Timeout | Memout |
|---|---|---|---|---|---|---|
| z3n | 0 | 16 | 0.162 | 0.182 | 0 | 0 |
| cvc5-uc | 0 | 16 | 0.199 | 0.194 | 0 | 0 |
| 2020-z3n | 0 | 16 | 0.332 | 0.346 | 0 | 0 |
| SMTInterpol | 0 | 16 | 6.317 | 4.238 | 0 | 0 |
| SMTInterpol-remus | 0 | 16 | 220.278 | 186.155 | 0 | 0 |
| 2020-CVC4-ucn | 0 | 14 | 0.174 | 0.265 | 0 | 0 |
| UltimateEliminator+MathSAT | 0 | 0 | 48.313 | 28.916 | 0 | 0 |
| Vampire | 0 | 0 | 2400.84 | 2400.834 | 2 | 0 |

Parallel Performance
| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Timeout | Memout |
|---|---|---|---|---|---|---|
| z3n | 0 | 16 | 0.162 | 0.182 | 0 | 0 |
| cvc5-uc | 0 | 16 | 0.199 | 0.194 | 0 | 0 |
| 2020-z3n | 0 | 16 | 0.332 | 0.346 | 0 | 0 |
| SMTInterpol | 0 | 16 | 6.317 | 4.238 | 0 | 0 |
| SMTInterpol-remus | 0 | 16 | 220.278 | 186.155 | 0 | 0 |
| 2020-CVC4-ucn | 0 | 14 | 0.174 | 0.265 | 0 | 0 |
| Vampire | 0 | 2 | 3651.03 | 921.51 | 0 | 0 |
| UltimateEliminator+MathSAT | 0 | 0 | 48.313 | 28.916 | 0 | 0 |
n: Non-competing.
N/A: Benchmark not known to be SAT/UNSAT, respectively.