The International Satisfiability Modulo Theories (SMT) Competition.
Competition results for the QF_UFLIA division in the Unsat Core Track.
Page generated on 2020-07-04 11:49:33 +0000
Benchmarks: 16 | Time limit: 1200 seconds | Memory limit: 60 GB
**Winners**

| Sequential Performance | Parallel Performance |
|---|---|
| Yices2 | Yices2 |
**Sequential Performance**

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Abstained | Timeout | Memout |
|---|---|---|---|---|---|---|---|
| Yices2-fixedn | 0 | 21 | 0.061 | 0.235 | 0 | 0 | |
| Yices2 | 0 | 21 | 0.065 | 0.272 | 0 | 0 | |
| MathSAT5n | 0 | 21 | 0.293 | 0.313 | 0 | 0 | |
| z3n | 0 | 21 | 0.63 | 0.632 | 0 | 0 | |
| SMTInterpol-fixedn | 0 | 21 | 6.3 | 4.835 | 0 | 0 | |
| SMTInterpol | 0 | 21 | 6.315 | 4.737 | 0 | 0 | |
| CVC4-uc | 0 | 18 | 0.223 | 0.295 | 0 | 0 | |
**Parallel Performance**

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Abstained | Timeout | Memout |
|---|---|---|---|---|---|---|---|
| Yices2-fixedn | 0 | 21 | 0.061 | 0.235 | 0 | 0 | |
| Yices2 | 0 | 21 | 0.065 | 0.272 | 0 | 0 | |
| MathSAT5n | 0 | 21 | 0.293 | 0.313 | 0 | 0 | |
| z3n | 0 | 21 | 0.63 | 0.632 | 0 | 0 | |
| SMTInterpol | 0 | 21 | 6.315 | 4.737 | 0 | 0 | |
| SMTInterpol-fixedn | 0 | 21 | 6.3 | 4.835 | 0 | 0 | |
| CVC4-uc | 0 | 18 | 0.223 | 0.295 | 0 | 0 | |
n: Non-competing. A trailing n on a solver name (e.g. MathSAT5n, z3n) marks an entry that does not compete for awards.
Abstained: Total number of benchmarks, across the logics in this division, that the solver chose to abstain from. For SAT/UNSAT scores, this column also counts benchmarks not known to be SAT/UNSAT.
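The two tables above contain the same solvers and scores but in different orders: the sequential ranking is driven by the CPU time score, the parallel ranking by the wall time score, which is why SMTInterpol and SMTInterpol-fixed swap places. A minimal sketch of that ordering, assuming a simplified ranking of higher correct score first with the relevant time score as tie-breaker (the full SMT-COMP scoring rules are more involved; the numbers below are copied from the tables):

```python
# Rows copied from the tables above: (solver, correct, cpu_time, wall_time).
# Error scores are all 0 in this division, so they are omitted here.
rows = [
    ("Yices2-fixed",      21, 0.061, 0.235),
    ("Yices2",            21, 0.065, 0.272),
    ("MathSAT5",          21, 0.293, 0.313),
    ("z3",                21, 0.630, 0.632),
    ("SMTInterpol-fixed", 21, 6.300, 4.835),
    ("SMTInterpol",       21, 6.315, 4.737),
    ("CVC4-uc",           18, 0.223, 0.295),
]

def rank(rows, time_index):
    # More correct answers first; ties broken by the chosen time score (lower is better).
    return sorted(rows, key=lambda r: (-r[1], r[time_index]))

sequential = rank(rows, 2)  # ordered by CPU time score
parallel   = rank(rows, 3)  # ordered by wall time score
```

Applying `rank` with the CPU time column reproduces the sequential table's row order, and with the wall time column the parallel table's order, including the SMTInterpol swap.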