The International Satisfiability Modulo Theories (SMT) Competition.
Competition results for the QF_LRA division in the Incremental Track.
Page generated on 2020-07-04 11:47:56 +0000
Benchmarks: 10
Time Limit: 1200 seconds
Memory Limit: 60 GB
| Parallel Performance |
|---|
| Yices2 incremental |
| Solver | Error Score | Correct Score | CPU Time Score (s) | Wall Time Score (s) | Unsolved | Abstained | Timeout | Memout |
|---|---|---|---|---|---|---|---|---|
| 2018-MathSAT (incremental)ⁿ | 0 | 1237 | 8473.321 | 8472.909 | 278 | 5 | 0 | |
| MathSAT5ⁿ | 0 | 978 | 11011.887 | 11011.732 | 537 | 9 | 0 | |
| Yices2-fixed incrementalⁿ | 0 | 913 | 12000.0 | 12000.0 | 602 | 10 | 0 | |
| Yices2 incremental | 0 | 911 | 12000.0 | 12000.0 | 604 | 10 | 0 | |
| OpenSMT | 0 | 884 | 10184.185 | 10184.466 | 631 | 7 | 0 | |
| SMTInterpol | 0 | 657 | 10263.431 | 10202.764 | 858 | 8 | 0 | |
| SMTInterpol-fixedⁿ | 0 | 656 | 10264.963 | 10201.607 | 859 | 8 | 0 | |
| CVC4-inc | 0 | 628 | 12000.0 | 12000.0 | 887 | 10 | 0 | |
| z3ⁿ | 0 | 614 | 12000.0 | 12000.0 | 901 | 10 | 0 | |
ⁿ Non-competing.
Abstained: Total number of benchmarks in logics in this division that the solver chose to abstain from. For SAT/UNSAT scores, this column also includes benchmarks not known to be SAT/UNSAT.
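
Note on reading the scores: an Incremental Track benchmark is a single SMT-LIB script that issues many check-sat commands, and the Correct Score essentially counts correctly answered check-sat queries rather than benchmark files, which is why the scores above far exceed the 10 benchmarks. A minimal, purely illustrative QF_LRA script in that style (hypothetical constraints, not an actual competition benchmark) might look like this:

```smt2
; Hypothetical incremental QF_LRA script: three check-sat queries in one run.
(set-logic QF_LRA)
(declare-const x Real)
(declare-const y Real)
(assert (>= (+ x y) 1.0))
(check-sat)        ; query 1: sat
(push 1)
(assert (<= x 0.0))
(assert (<= y 0.0))
(check-sat)        ; query 2: unsat (x <= 0 and y <= 0 contradict x + y >= 1)
(pop 1)
(assert (<= (- x y) 2.0))
(check-sat)        ; query 3: sat
(exit)
```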