The International Satisfiability Modulo Theories (SMT) Competition.
Competition results for the QF_UFFP division in the Single Query Track.
Page generated on 2020-07-04 11:46:59 +0000
Benchmarks: 2 · Time Limit: 1200 seconds · Memory Limit: 60 GB
Winners

| Sequential Performance | Parallel Performance | SAT Performance (parallel) | UNSAT Performance (parallel) | 24s Performance (parallel) |
|---|---|---|---|---|
| Bitwuzla | Bitwuzla | — | Bitwuzla | Bitwuzla |
Sequential Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Abstained | Timeout | Memout |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Bitwuzla | 0 | 2 | 0.231 | 0.231 | 2 | 0 | 2 | 0 | 0 | 0 | 0 |
| Bitwuzla-fixedn | 0 | 2 | 0.234 | 0.234 | 2 | 0 | 2 | 0 | 0 | 0 | 0 |
| COLIBRI | 0 | 2 | 0.638 | 0.659 | 2 | 0 | 2 | 0 | 0 | 0 | 0 |
| CVC4 | 0 | 2 | 0.855 | 0.854 | 2 | 0 | 2 | 0 | 0 | 0 | 0 |
| MathSAT5n | 0 | 2 | 1.128 | 1.129 | 2 | 0 | 2 | 0 | 0 | 0 | 0 |
Parallel Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Abstained | Timeout | Memout |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Bitwuzla | 0 | 2 | 0.231 | 0.231 | 2 | 0 | 2 | 0 | 0 | 0 | 0 |
| Bitwuzla-fixedn | 0 | 2 | 0.234 | 0.234 | 2 | 0 | 2 | 0 | 0 | 0 | 0 |
| COLIBRI | 0 | 2 | 0.638 | 0.659 | 2 | 0 | 2 | 0 | 0 | 0 | 0 |
| CVC4 | 0 | 2 | 0.855 | 0.854 | 2 | 0 | 2 | 0 | 0 | 0 | 0 |
| MathSAT5n | 0 | 2 | 1.128 | 1.129 | 2 | 0 | 2 | 0 | 0 | 0 | 0 |
SAT Performance (parallel)

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout |
|---|---|---|---|---|---|---|---|---|---|---|
| COLIBRI | 0 | 0 | 0.0 | 0.0 | 0 | 0 | 0 | 2 | 0 | 0 |
| CVC4 | 0 | 0 | 0.0 | 0.0 | 0 | 0 | 0 | 2 | 0 | 0 |
| Bitwuzla | 0 | 0 | 0.0 | 0.0 | 0 | 0 | 0 | 2 | 0 | 0 |
| MathSAT5n | 0 | 0 | 0.0 | 0.0 | 0 | 0 | 0 | 2 | 0 | 0 |
| Bitwuzla-fixedn | 0 | 0 | 0.0 | 0.0 | 0 | 0 | 0 | 2 | 0 | 0 |
UNSAT Performance (parallel)

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout |
|---|---|---|---|---|---|---|---|---|---|---|
| Bitwuzla | 0 | 2 | 0.231 | 0.231 | 2 | 0 | 2 | 0 | 0 | 0 |
| Bitwuzla-fixedn | 0 | 2 | 0.234 | 0.234 | 2 | 0 | 2 | 0 | 0 | 0 |
| COLIBRI | 0 | 2 | 0.638 | 0.659 | 2 | 0 | 2 | 0 | 0 | 0 |
| CVC4 | 0 | 2 | 0.855 | 0.854 | 2 | 0 | 2 | 0 | 0 | 0 |
| MathSAT5n | 0 | 2 | 1.128 | 1.129 | 2 | 0 | 2 | 0 | 0 | 0 |
24s Performance (parallel)

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Abstained | Timeout | Memout |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Bitwuzla | 0 | 2 | 0.231 | 0.231 | 2 | 0 | 2 | 0 | 0 | 0 | 0 |
| Bitwuzla-fixedn | 0 | 2 | 0.234 | 0.234 | 2 | 0 | 2 | 0 | 0 | 0 | 0 |
| COLIBRI | 0 | 2 | 0.638 | 0.659 | 2 | 0 | 2 | 0 | 0 | 0 | 0 |
| CVC4 | 0 | 2 | 0.855 | 0.854 | 2 | 0 | 2 | 0 | 0 | 0 | 0 |
| MathSAT5n | 0 | 2 | 1.128 | 1.129 | 2 | 0 | 2 | 0 | 0 | 0 | 0 |
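The rankings above order solvers first by error score, then by correct score, then by time score (lower is better). A minimal sketch of that ordering, using the wall-time figures copied from the tables; the sorting logic is an illustration of how the columns combine into a ranking, not the official SMT-COMP scoring tool:

```python
# Rows copied from the parallel-performance table above:
# (solver, error_score, correct_score, wall_time_score)
results = [
    ("Bitwuzla", 0, 2, 0.231),
    ("Bitwuzla-fixedn", 0, 2, 0.234),
    ("COLIBRI", 0, 2, 0.638),
    ("CVC4", 0, 2, 0.855),
    ("MathSAT5n", 0, 2, 1.129),
]

def rank(rows):
    # Fewest errors first, then most correct answers, then lowest wall time.
    return sorted(rows, key=lambda r: (r[1], -r[2], r[3]))

ranking = [name for name, *_ in rank(results)]
print(ranking[0])  # Bitwuzla, matching the winners row above
```

Since every solver here answered both benchmarks with no errors, the ranking reduces to a pure wall-time comparison.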
The trailing n on a solver name (e.g. MathSAT5n, Bitwuzla-fixedn) marks a non-competing entry.
Abstained: total number of benchmarks, across the logics in this division, that the solver chose to abstain from. For the SAT/UNSAT scores, this column also counts benchmarks not known to be SAT/UNSAT.