The International Satisfiability Modulo Theories (SMT) Competition.
Competition results for the QF_ALIA logic in the Unsat Core Track.
Page generated on 2022-08-10 11:18:51 +0000
Benchmarks: 30 | Time Limit: 1200 seconds | Memory Limit: 60 GB
Winners

| Sequential Performance | Parallel Performance |
|---|---|
| smtinterpol | smtinterpol |
Sequential Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Timeout | Memout |
|---|---|---|---|---|---|---|
| z3-4.8.17ⁿ | 0 | 654 | 0.936 | 0.888 | 0 | 0 |
| smtinterpol | 0 | 633 | 32.659 | 17.041 | 0 | 0 |
| Yices2 | 0 | 594 | 0.156 | 0.379 | 0 | 0 |
| MathSATⁿ | 0 | 553 | 0.769 | 0.777 | 0 | 0 |
| 2021-MathSAT5ⁿ | 0 | 553 | 0.792 | 0.796 | 0 | 0 |
| cvc5 | 0 | 0 | 2.451 | 2.437 | 0 | 0 |
Parallel Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Timeout | Memout |
|---|---|---|---|---|---|---|
| z3-4.8.17ⁿ | 0 | 654 | 0.936 | 0.888 | 0 | 0 |
| smtinterpol | 0 | 633 | 32.659 | 17.041 | 0 | 0 |
| Yices2 | 0 | 594 | 0.156 | 0.379 | 0 | 0 |
| MathSATⁿ | 0 | 553 | 0.769 | 0.777 | 0 | 0 |
| 2021-MathSAT5ⁿ | 0 | 553 | 0.792 | 0.796 | 0 | 0 |
| cvc5 | 0 | 0 | 2.451 | 2.437 | 0 | 0 |
ⁿ Non-competing.
N/A: Benchmarks not known to be SAT/UNSAT, respectively.