The International Satisfiability Modulo Theories (SMT) Competition.
Competition results for the QF_ALIA logic in the Unsat Core Track.
Page generated on 2021-07-18 17:31:25 +0000
Benchmarks: 30
Time Limit: 1200 seconds
Memory Limit: 60 GB
|        | Sequential Performance | Parallel Performance |
|--------|------------------------|----------------------|
| Winner | SMTInterpol            | SMTInterpol          |
Sequential Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Timeout | Memout |
|---|---|---|---|---|---|---|
| 2020-z3ⁿ | 0 | 720 | 1.246 | 1.247 | 0 | 0 |
| SMTInterpol | 0 | 713 | 27.78 | 14.403 | 0 | 0 |
| z3ⁿ | 0 | 712 | 0.841 | 0.843 | 0 | 0 |
| 2020-CVC4-ucⁿ | 0 | 613 | 1.255 | 1.258 | 0 | 0 |
| SMTInterpol-remus | 0 | 596 | 19641.985 | 18761.797 | 3 | 0 |
| cvc5-uc | 0 | 576 | 2.32 | 2.306 | 0 | 0 |
| MathSAT5ⁿ | 0 | 544 | 0.8 | 0.801 | 0 | 0 |
| Yices2 | 0 | 533 | 0.155 | 0.376 | 0 | 0 |
| 2020-Yices2-fixedⁿ | 0 | 533 | 0.156 | 0.567 | 0 | 0 |
Parallel Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Timeout | Memout |
|---|---|---|---|---|---|---|
| 2020-z3ⁿ | 0 | 720 | 1.246 | 1.247 | 0 | 0 |
| SMTInterpol | 0 | 713 | 27.78 | 14.403 | 0 | 0 |
| SMTInterpol-remus | 0 | 713 | 19653.505 | 18582.687 | 0 | 0 |
| z3ⁿ | 0 | 712 | 0.841 | 0.843 | 0 | 0 |
| 2020-CVC4-ucⁿ | 0 | 613 | 1.255 | 1.258 | 0 | 0 |
| cvc5-uc | 0 | 576 | 2.32 | 2.306 | 0 | 0 |
| MathSAT5ⁿ | 0 | 544 | 0.8 | 0.801 | 0 | 0 |
| Yices2 | 0 | 533 | 0.155 | 0.376 | 0 | 0 |
| 2020-Yices2-fixedⁿ | 0 | 533 | 0.156 | 0.567 | 0 | 0 |
ⁿ Non-competing.