The International Satisfiability Modulo Theories (SMT) Competition.
Competition results for the QF_LIA division in the Model Validation Track.
Page generated on 2020-07-04 11:50:14 +0000
Benchmarks: 1532 | Time limit: 1200 seconds | Memory limit: 60 GB
This division is experimental: solvers are ranked by performance only, and no winner is declared.
Sequential Performance
Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Abstained | Timeout | Memout |
---|---|---|---|---|---|---|---|
z3n | 0 | 1461 | 188635.889 | 188590.757 | 70 | 1 | |
CVC4-mv | 0 | 1413 | 171782.588 | 171520.789 | 116 | 0 | |
Yices2 Model Validation | 0 | 1293 | 318640.804 | 318586.617 | 239 | 0 | |
Yices2-fixed Model Validationn | 0 | 1292 | 318817.204 | 318835.381 | 240 | 0 | |
SMTInterpol-fixedn | 0 | 1273 | 404680.659 | 389779.138 | 259 | 0 | |
SMTInterpol | 0 | 1272 | 404643.459 | 389786.135 | 260 | 0 | |
MathSAT5-mvn | 14* | 1440 | 176506.816 | 176433.952 | 78 | 0 | |
Parallel Performance
Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Abstained | Timeout | Memout |
---|---|---|---|---|---|---|---|
z3n | 0 | 1461 | 188641.519 | 188588.467 | 70 | 1 | |
CVC4-mv | 0 | 1413 | 171792.048 | 171517.539 | 116 | 0 | |
Yices2 Model Validation | 0 | 1293 | 318653.504 | 318579.807 | 239 | 0 | |
Yices2-fixed Model Validationn | 0 | 1292 | 318829.934 | 318829.741 | 240 | 0 | |
SMTInterpol-fixedn | 0 | 1275 | 404712.029 | 389760.068 | 257 | 0 | |
SMTInterpol | 0 | 1274 | 404650.729 | 389736.485 | 258 | 0 | |
MathSAT5-mvn | 14* | 1440 | 176516.546 | 176431.022 | 78 | 0 | |
n Non-competing.
Abstained: total number of benchmarks, across this division's logics, that the solver chose to abstain from. For SAT/UNSAT scores, this column also includes benchmarks not known to be SAT/UNSAT.
* The error score stems from MathSAT not printing full models (a syntactic issue); it does not indicate unsoundness.
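As a rough illustration of how the columns above relate to per-benchmark outcomes, the sketch below tallies correct, error, abstained, timeout, and time scores from a list of results. The status names and data layout are hypothetical, chosen for illustration; they are not the official SMT-COMP scoring scripts.

```python
# Hypothetical sketch: aggregate the table columns from per-benchmark
# outcomes for one solver. Status labels are illustrative, not official.
from dataclasses import dataclass

@dataclass
class Result:
    status: str      # "valid-model", "invalid-model", "abstain", "timeout", "memout"
    cpu_time: float  # seconds
    wall_time: float # seconds

def score(results):
    """Tally one solver's row: error, correct, time, abstained, timeout, memout."""
    return {
        "error":     sum(1 for r in results if r.status == "invalid-model"),
        "correct":   sum(1 for r in results if r.status == "valid-model"),
        "cpu":       sum(r.cpu_time for r in results),
        "wall":      sum(r.wall_time for r in results),
        "abstained": sum(1 for r in results if r.status == "abstain"),
        "timeout":   sum(1 for r in results if r.status == "timeout"),
        "memout":    sum(1 for r in results if r.status == "memout"),
    }

example = [
    Result("valid-model", 1.5, 1.5),
    Result("abstain", 0.1, 0.1),
    Result("timeout", 1200.0, 1200.0),  # hit the 1200-second limit
]
print(score(example)["correct"])  # -> 1
```

In this reading, a model judged invalid raises the error score, while abstaining merely forgoes the benchmark, which is why MathSAT5-mv's 14 syntactically incomplete models count as errors rather than abstentions.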