The International Satisfiability Modulo Theories (SMT) Competition.
Competition results for the QF_LIRA division in the Model Validation Track.
Page generated on 2020-07-04 11:50:14 +0000
Benchmarks: 1 | Time Limit: 1200 seconds | Memory Limit: 60 GB
This division is experimental: solvers are ranked by performance only, and no winner is declared.
| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Abstained | Timeout | Memout |
|---|---|---|---|---|---|---|---|
| Yices2-fixed Model Validationⁿ | 0 | 1 | 0.098 | 0.098 | 0 | 0 | |
| Yices2 Model Validation | 0 | 1 | 0.098 | 0.098 | 0 | 0 | |
| z3ⁿ | 0 | 1 | 0.343 | 0.344 | 0 | 0 | |
| CVC4-mv | 0 | 1 | 1.357 | 1.357 | 0 | 0 | |
| MathSAT5-mvⁿ | 0 | 1 | 2.52 | 2.523 | 0 | 0 | |
| SMTInterpol | 0 | 1 | 47.578 | 28.592 | 0 | 0 | |
| SMTInterpol-fixedⁿ | 0 | 1 | 49.014 | 29.0 | 0 | 0 | |
ⁿ Non-competing.
Abstained: total number of benchmarks, across the logics in this division, that the solver chose to abstain from. For SAT/UNSAT scores, this column also counts benchmarks not known to be SAT/UNSAT.
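The performance ordering used in the table above can be sketched as follows. This is a hedged illustration, not the official SMT-COMP scoring code: it assumes solvers are sorted by error score (ascending), then correct score (descending), then wall-time score (ascending); the competition's actual tie-breaking rules may differ.

```python
# Rows from the QF_LIRA Model Validation table:
# (solver, error score, correct score, cpu-time score, wall-time score)
results = [
    ("Yices2-fixed Model Validation", 0, 1, 0.098, 0.098),
    ("Yices2 Model Validation",       0, 1, 0.098, 0.098),
    ("z3",                            0, 1, 0.343, 0.344),
    ("CVC4-mv",                       0, 1, 1.357, 1.357),
    ("MathSAT5-mv",                   0, 1, 2.52,  2.523),
    ("SMTInterpol",                   0, 1, 47.578, 28.592),
    ("SMTInterpol-fixed",             0, 1, 49.014, 29.0),
]

# Assumed ordering: fewer errors first, more correct answers next,
# lower wall-time score last. Python's sort is stable, so equal rows
# keep their listed order.
ranked = sorted(results, key=lambda r: (r[1], -r[2], r[4]))

for name, err, correct, cpu, wall in ranked:
    print(f"{name}: errors={err} correct={correct} wall={wall}s")
```

Since every solver here scored 0 errors and 1 correct answer on the single benchmark, the wall-time score alone determines the order shown.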