The International Satisfiability Modulo Theories (SMT) Competition.
Competition results for the QF_ALIA division as of Thu Jul 7 07:24:34 GMT
Benchmarks in this division: 139
Time Limit: 1200s
Winners
| Sequential Performance | Parallel Performance |
|---|---|
| Yices2 | Yices2 |
Sequential Performance
| Solver | Error Score | Correct Score | avg. CPU time (s) |
|---|---|---|---|
| CVC4 | 0.000 | 137.590 | 34.567 |
| MathSat5n | 0.000 | 139.000 | 2.621 |
| SMTInterpol | 0.000 | 139.000 | 6.508 |
| Yices2 | 0.000 | 139.000 | 0.235 |
| veriT-dev | 0.000 | 20.566 | 0.047 |
| z3n | 0.000 | 125.676 | 242.180 |
Parallel Performance
| Solver | Error Score | Correct Score | avg. CPU time (s) | avg. wall time (s) | Unsolved |
|---|---|---|---|---|---|
| CVC4 | 0.000 | 137.590 | 34.580 | 34.563 | 1 |
| MathSat5n | 0.000 | 139.000 | 2.621 | 2.545 | 0 |
| SMTInterpol | 0.000 | 139.000 | 6.508 | 3.910 | 0 |
| Yices2 | 0.000 | 139.000 | 0.235 | 0.239 | 0 |
| veriT-dev | 0.000 | 20.566 | 0.047 | 0.048 | 123 |
| z3n | 0.000 | 125.676 | 242.302 | 242.175 | 6 |
n. Solvers marked with an n suffix ran as non-competitive entries.
1. Scores are computed according to Section 7 of the rules.
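For readers who want a feel for what the columns mean, the sketch below aggregates per-benchmark results into an error count, a correct count, and an average CPU time per solver. It is a minimal illustration, not the official Section 7 scoring: the fractional Correct Score values above reflect benchmark weighting that this sketch omits, and the input file format and field names (`solver`, `expected`, `result`, `cpu_time`) are assumptions, not the competition's actual output format.

```python
# Simplified, illustrative scoring pass over per-benchmark results.
# NOT the official Section 7 rules (no benchmark weighting); the CSV layout
# and column names are hypothetical and used only for this example.
import csv
from collections import defaultdict

def summarize(results_csv):
    """Aggregate per-solver error count, correct count, and average CPU time.

    Expects rows with columns: solver, expected, result, cpu_time
    (an assumed format, not the competition's actual output).
    """
    stats = defaultdict(lambda: {"error": 0, "correct": 0, "cpu": 0.0, "n": 0})
    with open(results_csv, newline="") as f:
        for row in csv.DictReader(f):
            s = stats[row["solver"]]
            s["n"] += 1
            s["cpu"] += float(row["cpu_time"])
            if row["result"] in ("sat", "unsat"):
                if row["expected"] in ("sat", "unsat") and row["result"] != row["expected"]:
                    s["error"] += 1    # answer contradicts the expected status
                else:
                    s["correct"] += 1  # correct (or unconfirmed) answer
            # timeouts and "unknown" results add CPU time but score nothing
    for solver, s in sorted(stats.items()):
        print(f"{solver:12s} error={s['error']} correct={s['correct']} "
              f"avg_cpu={s['cpu'] / s['n']:.3f}s")
```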