The International Satisfiability Modulo Theories (SMT) Competition.
Competition results for the QF_AUFLIA division in the Incremental Track.
Page generated on 2020-07-04 11:47:56 +0000
Benchmarks: 49 Time Limit: 1200 seconds Memory Limit: 60 GB
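The Incremental Track evaluates solvers on SMT-LIB scripts that issue many check-sat calls over a shared assertion stack (via push and pop), rather than a single query per file. As an illustration only, the following sketch uses the Python bindings of z3 (one of the solvers listed below) to run two such queries on a small QF_AUFLIA-style problem; the constraints are invented for the example and are not drawn from the competition benchmarks.

```python
# Minimal sketch of an incremental QF_AUFLIA query (arrays, uninterpreted
# functions, linear integer arithmetic) using the z3 Python bindings.
# The constraints are illustrative and not taken from the competition set.
from z3 import Array, Function, Int, IntSort, Select, SolverFor, Store

s = SolverFor("QF_AUFLIA")

A = Array('A', IntSort(), IntSort())     # integer-indexed integer array
f = Function('f', IntSort(), IntSort())  # uninterpreted function
i, v = Int('i'), Int('v')

# Assertions shared by every query on the stack.
s.add(Select(Store(A, i, v), i) == v)    # instance of the read-over-write axiom
s.add(f(i) >= 0)

# First check-sat: push an extra constraint, check, then discard it.
s.push()
s.add(f(i) + v < 0, v >= 0)              # forces f(i) < 0, contradicting f(i) >= 0
print(s.check())                          # unsat
s.pop()

# Second check-sat against the same shared assertions.
s.push()
s.add(v == 5, f(i) == Select(A, i))
print(s.check())                          # sat
s.pop()
```

An Incremental Track benchmark consists of many such check-sat calls; the Correct Score column below counts how many of them a solver answered correctly within the time and memory limits above.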
| Parallel Performance |
|---|
| Yices2 incremental |
| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Unsolved | Abstained | Timeout | Memout |
|---|---|---|---|---|---|---|---|---|
| 2018-Yices (incremental)ⁿ | 0 | 1455587 | 1219.622 | 1124.07 | 0 | 0 | 0 | |
| Yices2 incremental | 0 | 1455587 | 1374.283 | 1270.854 | 0 | 0 | 0 | |
| Yices2-fixed incrementalⁿ | 0 | 1455587 | 1372.997 | 1271.143 | 0 | 0 | 0 | |
| z3ⁿ | 0 | 1428565 | 2345.58 | 2228.202 | 27022 | 1 | 0 | |
| SMTInterpol-fixedⁿ | 0 | 932446 | 4171.482 | 3686.384 | 523141 | 1 | 0 | |
| SMTInterpol | 0 | 932139 | 4147.248 | 3667.816 | 523448 | 1 | 0 | |
| MathSAT5ⁿ | 0 | 642413 | 3353.185 | 3277.72 | 813174 | 1 | 0 | |
| CVC4-inc | 0 | 295671 | 6395.206 | 6386.471 | 1159916 | 4 | 0 | |
ⁿ Non-competing.
Abstained: total number of benchmarks in this division's logics that the solver chose to abstain from. For SAT/UNSAT scores, this column also includes benchmarks not known to be SAT/UNSAT.
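For readers who want to recompute a ranking like the Parallel Performance line above from the table, here is a hypothetical sketch. The ordering it uses (error score ascending, then correct score descending, then wall-time score ascending, with non-competing entries excluded) is an assumption made for illustration; the authoritative scoring procedure is defined by the SMT-COMP rules, not by this snippet.

```python
# Hypothetical re-ranking of the table above. Only the columns needed for the
# assumed ordering are transcribed; the ordering itself is an assumption, not
# the official SMT-COMP scoring definition.
from dataclasses import dataclass

@dataclass
class Row:
    solver: str
    error: int
    correct: int
    wall_time: float
    competing: bool = True

rows = [
    Row("2018-Yices (incremental)", 0, 1455587, 1124.07, competing=False),
    Row("Yices2 incremental",       0, 1455587, 1270.854),
    Row("Yices2-fixed incremental", 0, 1455587, 1271.143, competing=False),
    Row("z3",                       0, 1428565, 2228.202, competing=False),
    Row("SMTInterpol-fixed",        0,  932446, 3686.384, competing=False),
    Row("SMTInterpol",              0,  932139, 3667.816),
    Row("MathSAT5",                 0,  642413, 3277.72,  competing=False),
    Row("CVC4-inc",                 0,  295671, 6386.471),
]

# Exclude non-competing entries, then order by error score (ascending),
# correct score (descending), and wall-time score (ascending).
ranked = sorted(
    (r for r in rows if r.competing),
    key=lambda r: (r.error, -r.correct, r.wall_time),
)
print(ranked[0].solver)   # Yices2 incremental
```

Under this assumed ordering, the top competing entry is Yices2 incremental, which matches the Parallel Performance winner shown above.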