The International Satisfiability Modulo Theories (SMT) Competition.
Competition results for the QF_UFLIA logic in the Unsat Core Track.
Page generated on 2021-07-18 17:31:25 +0000
Benchmarks: 16
Time Limit: 1200 seconds
Memory Limit: 60 GB
Winners

Sequential Performance | Parallel Performance |
---|---|
Yices2 | Yices2 |
Sequential Performance

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Timeout | Memout |
---|---|---|---|---|---|---|
2020-Yices2-fixedn | 0 | 21 | 0.061 | 0.207 | 0 | 0 |
Yices2 | 0 | 21 | 0.062 | 0.186 | 0 | 0 |
cvc5-uc | 0 | 21 | 0.274 | 0.267 | 0 | 0 |
MathSAT5n | 0 | 21 | 0.299 | 0.304 | 0 | 0 |
z3n | 0 | 21 | 0.364 | 0.376 | 0 | 0 |
2020-z3n | 0 | 21 | 0.605 | 0.606 | 0 | 0 |
SMTInterpol | 0 | 21 | 6.712 | 5.0 | 0 | 0 |
SMTInterpol-remus | 0 | 21 | 10.239 | 6.809 | 0 | 0 |
2020-CVC4-ucn | 0 | 18 | 0.234 | 0.269 | 0 | 0 |
Parallel Performance

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Timeout | Memout |
---|---|---|---|---|---|---|
Yices2 | 0 | 21 | 0.062 | 0.186 | 0 | 0 |
2020-Yices2-fixedn | 0 | 21 | 0.061 | 0.207 | 0 | 0 |
cvc5-uc | 0 | 21 | 0.274 | 0.267 | 0 | 0 |
MathSAT5n | 0 | 21 | 0.299 | 0.304 | 0 | 0 |
z3n | 0 | 21 | 0.364 | 0.376 | 0 | 0 |
2020-z3n | 0 | 21 | 0.605 | 0.606 | 0 | 0 |
SMTInterpol | 0 | 21 | 6.712 | 5.0 | 0 | 0 |
SMTInterpol-remus | 0 | 21 | 10.239 | 6.809 | 0 | 0 |
2020-CVC4-ucn | 0 | 18 | 0.234 | 0.269 | 0 | 0 |
n: Solvers marked with a trailing "n" are non-competing.
N/A: Benchmarks not known to be SAT/UNSAT, respectively.