The International Satisfiability Modulo Theories (SMT) Competition.
Competition results for the QF_UFLRA logic in the Unsat Core Track.
Page generated on 2021-07-18 17:31:25 +0000
Benchmarks: 101 | Time limit: 1200 seconds | Memory limit: 60 GB
Winners

Sequential Performance | Parallel Performance
---|---
SMTInterpol-remus | SMTInterpol-remus
Sequential Performance

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Timeout | Memout
---|---|---|---|---|---|---
SMTInterpol-remus | 0 | 70 | 2644.232 | 1573.966 | 0 | 0
cvc5-uc | 0 | 62 | 1083.162 | 1083.264 | 0 | 0
2020-z3n | 0 | 61 | 60.605 | 60.63 | 0 | 0
MathSAT5n | 0 | 61 | 105.805 | 105.841 | 0 | 0
2020-CVC4-ucn | 0 | 61 | 263.586 | 260.046 | 0 | 0
Yices2 | 0 | 58 | 20.726 | 21.067 | 0 | 0
2020-Yices2-fixedn | 0 | 58 | 21.1 | 21.523 | 0 | 0
z3n | 0 | 58 | 11055.403 | 11056.902 | 4 | 0
SMTInterpol | 0 | 57 | 1694.337 | 982.499 | 0 | 0
Parallel Performance

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Timeout | Memout
---|---|---|---|---|---|---
SMTInterpol-remus | 0 | 70 | 2644.232 | 1573.966 | 0 | 0
cvc5-uc | 0 | 62 | 1083.162 | 1083.264 | 0 | 0
2020-z3n | 0 | 61 | 60.605 | 60.63 | 0 | 0
MathSAT5n | 0 | 61 | 105.805 | 105.841 | 0 | 0
2020-CVC4-ucn | 0 | 61 | 263.586 | 260.046 | 0 | 0
Yices2 | 0 | 58 | 20.726 | 21.067 | 0 | 0
2020-Yices2-fixedn | 0 | 58 | 21.1 | 21.523 | 0 | 0
z3n | 0 | 58 | 11055.543 | 11056.812 | 4 | 0
SMTInterpol | 0 | 57 | 1694.337 | 982.499 | 0 | 0
n: Non-competing solver (shown for comparison only; the trailing "n" in a solver name marks this status).
N/A: Benchmarks not known to be SAT/UNSAT, respectively.