The International Satisfiability Modulo Theories (SMT) Competition.
Competition results for the QF_AUFLIA division as of Thu Jul 7 07:24:34 GMT
Benchmarks in this division: 1009
Winners

Sequential Performance | Parallel Performance |
---|---|
Yices2 | Yices2 |
Sequential Performance

Solver | Error Score | Correct Score | avg. CPU time |
---|---|---|---|
CVC4 | 0.000 | 1009.000 | 4.101 |
MathSat5n | 0.000 | 1009.000 | 2.522 |
SMTInterpol | 0.000 | 1009.000 | 2.333 |
Yices2 | 0.000 | 1009.000 | 0.039 |
veriT-dev | 0.000 | 66.447 | 0.012 |
z3n | 0.000 | 1009.000 | 0.126 |
Parallel Performance

Solver | Error Score | Correct Score | avg. CPU time | avg. WALL time | Unsolved |
---|---|---|---|---|---|
CVC4 | 0.000 | 1009.000 | 4.101 | 4.098 | 0 |
MathSat5n | 0.000 | 1009.000 | 2.522 | 2.520 | 0 |
SMTInterpol | 0.000 | 1009.000 | 2.333 | 1.390 | 0 |
Yices2 | 0.000 | 1009.000 | 0.039 | 0.043 | 0 |
veriT-dev | 0.000 | 66.447 | 0.012 | 0.016 | 994 |
z3n | 0.000 | 1009.000 | 0.126 | 0.124 | 0 |
n. Non-competitive.
1. Scores are computed according to Section 7 of the rules.
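For readers unfamiliar with the column meanings, the sketch below shows one plausible way to aggregate per-benchmark runs into the Error Score, Correct Score, average time, and Unsolved columns shown above. It is only an illustration under assumed record fields (`expected`, `answer`, `cpu_time`, `wall_time`); the official scores are defined in Section 7 of the competition rules and may apply weighting not reproduced here (as the fractional correct score of veriT-dev suggests).

```python
from dataclasses import dataclass
from statistics import mean


@dataclass
class BenchmarkResult:
    """One solver's run on one benchmark (hypothetical record format)."""
    expected: str    # benchmark status: "sat" or "unsat"
    answer: str      # solver output: "sat", "unsat", or "unknown"/timeout
    cpu_time: float  # CPU seconds used by the run
    wall_time: float # wall-clock seconds used by the run


def summarize(results: list[BenchmarkResult]) -> dict[str, float]:
    """Aggregate per-benchmark runs into table-style columns.

    Illustrative only: the competition's actual scoring is specified in
    Section 7 of the rules and is not reproduced exactly here.
    """
    errors = sum(1 for r in results
                 if r.answer in ("sat", "unsat") and r.answer != r.expected)
    correct = sum(1 for r in results if r.answer == r.expected)
    unsolved = sum(1 for r in results if r.answer not in ("sat", "unsat"))
    return {
        "error_score": float(errors),
        "correct_score": float(correct),
        "avg_cpu_time": mean(r.cpu_time for r in results),
        "avg_wall_time": mean(r.wall_time for r in results),
        "unsolved": float(unsolved),
    }
```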