The International Satisfiability Modulo Theories (SMT) Competition.
Competition results for the QF_IDL division in the Model Validation Track.
Page generated on 2020-07-04 11:50:14 +0000
Benchmarks: 512 | Time Limit: 1200 seconds | Memory Limit: 60 GB
This division is experimental. Solvers are ranked by performance only; no winner is declared.
| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Abstained | Timeout | Memout |
|---|---|---|---|---|---|---|---|
| Yices2 Model Validation | 0 | 487 | 35389.528 | 35376.161 | 25 | 0 | |
| Yices2-fixed Model Validationⁿ | 0 | 487 | 35405.088 | 35393.349 | 25 | 0 | |
| z3ⁿ | 0 | 478 | 44659.544 | 44679.086 | 29 | 0 | |
| CVC4-mv | 0 | 441 | 130578.806 | 130542.963 | 71 | 0 | |
| SMTInterpol-fixedⁿ | 0 | 320 | 268434.789 | 264935.925 | 192 | 0 | |
| SMTInterpol | 0 | 319 | 268666.114 | 265253.85 | 193 | 0 | |
| MathSAT5-mvⁿ | 5* | 334 | 238095.162 | 238118.56 | 173 | 0 | |
| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Abstained | Timeout | Memout |
|---|---|---|---|---|---|---|---|
| Yices2 Model Validation | 0 | 487 | 35391.708 | 35375.521 | 25 | 0 | |
| Yices2-fixed Model Validationⁿ | 0 | 487 | 35407.618 | 35392.469 | 25 | 0 | |
| z3ⁿ | 0 | 478 | 44663.604 | 44678.176 | 29 | 0 | |
| CVC4-mv | 0 | 441 | 130590.026 | 130540.473 | 71 | 0 | |
| SMTInterpol-fixedⁿ | 0 | 321 | 268441.109 | 264925.145 | 191 | 0 | |
| SMTInterpol | 0 | 320 | 268671.694 | 265242.24 | 192 | 0 | |
| MathSAT5-mvⁿ | 5* | 334 | 238118.942 | 238111.74 | 173 | 0 | |
ⁿ Non-competing.
Abstained: Total number of benchmarks in this division's logics that the solver chose to abstain from. For SAT/UNSAT scores, this column also includes benchmarks not known to be SAT/UNSAT.
* The error score stems from MathSAT not emitting full models (a syntactic problem); it does not indicate unsoundness.
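In the Model Validation Track, a solver's reported model is checked by evaluating every assertion under the returned assignment. As a rough illustration (not the competition's actual validator), the check for QF_IDL can be sketched as below: each atom has the form x − y ≤ c, and a model is simply an integer assignment. The function and instance names here are hypothetical.

```python
# Minimal sketch of model validation for QF_IDL (integer difference logic).
# Each constraint is a triple (x, y, c) encoding the atom x - y <= c;
# a model maps variable names to integers.

def validate_model(constraints, model):
    """Return the constraints violated by the model (empty list = model is valid)."""
    violations = []
    for (x, y, c) in constraints:
        if model[x] - model[y] > c:  # the atom x - y <= c does not hold
            violations.append((x, y, c))
    return violations

# Hypothetical instance: x - y <= 3 and y - x <= -1 (i.e. x > y).
constraints = [("x", "y", 3), ("y", "x", -1)]
model = {"x": 2, "y": 1}
print(validate_model(constraints, model))  # → [] (the model satisfies every atom)
```

A model with a missing or malformed variable binding would make this evaluation fail even when the underlying SAT answer is correct, which is the kind of syntactic error penalized in the tables above.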