The International Satisfiability Modulo Theories (SMT) Competition.
Competition results for the QF_ALIA division in the Unsat Core Track.
Page generated on 2020-07-04 11:49:33 +0000
Benchmarks: 30 · Time Limit: 1200 seconds · Memory Limit: 60 GB
Winners:

| Sequential Performance | Parallel Performance |
| --- | --- |
| CVC4-uc | CVC4-uc |
Sequential Performance:

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Abstained | Timeout | Memout |
| --- | --- | --- | --- | --- | --- | --- | --- |
| z3n | 0 | 720 | 1.239 | 1.241 | 0 | 0 | |
| CVC4-uc | 0 | 677 | 1.17 | 1.182 | 0 | 0 | |
| SMTInterpol-fixedn | 0 | 644 | 21.245 | 12.564 | 0 | 0 | |
| SMTInterpol | 0 | 644 | 21.28 | 12.582 | 0 | 0 | |
| MathSAT5n | 0 | 583 | 0.789 | 0.823 | 0 | 0 | |
| Yices2-fixedn | 0 | 564 | 0.155 | 0.38 | 0 | 0 | |
| Yices2 | 0 | 564 | 0.162 | 0.526 | 0 | 0 | |
Parallel Performance:

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Abstained | Timeout | Memout |
| --- | --- | --- | --- | --- | --- | --- | --- |
| z3n | 0 | 720 | 1.239 | 1.241 | 0 | 0 | |
| CVC4-uc | 0 | 677 | 1.17 | 1.182 | 0 | 0 | |
| SMTInterpol-fixedn | 0 | 644 | 21.245 | 12.564 | 0 | 0 | |
| SMTInterpol | 0 | 644 | 21.28 | 12.582 | 0 | 0 | |
| MathSAT5n | 0 | 583 | 0.789 | 0.823 | 0 | 0 | |
| Yices2-fixedn | 0 | 564 | 0.155 | 0.38 | 0 | 0 | |
| Yices2 | 0 | 564 | 0.162 | 0.526 | 0 | 0 | |
n: Non-competing solver.
Abstained: Total number of benchmarks, across the logics in this division, that the solver chose to abstain from. For SAT/UNSAT scores, this column also counts benchmarks whose SAT/UNSAT status is unknown.
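For context, the Unsat Core Track asks a solver to report, for an unsatisfiable benchmark, a subset of the named assertions that is itself unsatisfiable; a solver's Correct Score grows with the number of assertions it can exclude from the core. A minimal SMT-LIB 2 sketch of this interaction in the QF_ALIA logic (a hypothetical script for illustration, not one of the 30 competition benchmarks):

```smt2
; Hypothetical QF_ALIA example: arrays over linear integer arithmetic.
(set-option :produce-unsat-cores true)
(set-logic QF_ALIA)
(declare-const a (Array Int Int))
(declare-const i Int)
(declare-const x Int)
; Named assertions; only named assertions may appear in the core.
(assert (! (= (select a i) x) :named read-x))
(assert (! (> x 0) :named x-pos))
(assert (! (< (select a i) 0) :named read-neg))
(assert (! (= i i) :named redundant))
(check-sat)       ; unsat
(get-unsat-core)  ; a subset of the names, e.g. omitting "redundant"
```

The first three assertions are jointly inconsistent (the array read would have to be both positive and negative), while `redundant` contributes nothing; a smaller reported core that excludes it scores better under the track's rules.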