The International Satisfiability Modulo Theories (SMT) Competition.
Competition results for the QF_UFBV division in the Unsat Core Track.
Page generated on 2020-07-04 11:49:33 +0000
Benchmarks: 300 | Time Limit: 1200 seconds | Memory Limit: 60 GB
| Sequential Performance | Parallel Performance |
|---|---|
| Yices2 | Yices2 |
Sequential Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Abstained | Timeout | Memout |
|---|---|---|---|---|---|---|---|
| z3ⁿ | 0 | 615580 | 3872.4 | 3874.215 | 3 | 0 | |
| Yices2 | 0 | 610474 | 2769.94 | 2771.073 | 2 | 0 | |
| Yices2-fixedⁿ | 0 | 610474 | 2772.005 | 2773.189 | 2 | 0 | |
| Bitwuzla-fixedⁿ | 0 | 609598 | 34.901 | 35.305 | 0 | 0 | |
| Bitwuzla | 0 | 609598 | 35.655 | 36.23 | 0 | 0 | |
| CVC4-uc | 0 | 578923 | 9060.305 | 9062.201 | 7 | 0 | |
| MathSAT5ⁿ | 0 | 0 | 19.927 | 20.049 | 0 | 0 | |
Parallel Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Abstained | Timeout | Memout |
|---|---|---|---|---|---|---|---|
| z3ⁿ | 0 | 615580 | 3874.01 | 3874.075 | 3 | 0 | |
| Yices2 | 0 | 610474 | 2770.12 | 2771.013 | 2 | 0 | |
| Yices2-fixedⁿ | 0 | 610474 | 2772.395 | 2772.979 | 2 | 0 | |
| Bitwuzla-fixedⁿ | 0 | 609598 | 34.901 | 35.305 | 0 | 0 | |
| Bitwuzla | 0 | 609598 | 35.655 | 36.23 | 0 | 0 | |
| CVC4-uc | 0 | 578923 | 9063.115 | 9061.731 | 7 | 0 | |
| MathSAT5ⁿ | 0 | 0 | 19.927 | 20.049 | 0 | 0 | |
ⁿ Non-competing.
Abstained: Total number of benchmarks in the logics of this division that the solver chose to abstain from. For SAT/UNSAT scores, this column also includes benchmarks not known to be SAT/UNSAT.
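As a reading aid for the tables above, the sketch below shows how the listed winner (Yices2) follows from the sequential table once non-competing entries (marked ⁿ) are set aside, even though z3ⁿ has a higher correct score. The ordering criteria used here (fewest errors, then highest correct score, then lowest wall-time score) are an assumption made for illustration, not a quote of the official SMT-COMP scoring rules; the row values are transcribed from the sequential table above.

```python
# Minimal sketch, not the official SMT-COMP scoring code.
# Assumption: winner = best competing solver by (errors asc, correct desc, wall time asc).
from dataclasses import dataclass

@dataclass
class Result:
    solver: str
    error_score: int
    correct_score: int
    wall_time_score: float
    non_competing: bool = False  # ⁿ in the tables above

# Sequential-performance rows transcribed from the table above.
sequential = [
    Result("z3", 0, 615580, 3874.215, non_competing=True),
    Result("Yices2", 0, 610474, 2771.073),
    Result("Yices2-fixed", 0, 610474, 2773.189, non_competing=True),
    Result("Bitwuzla-fixed", 0, 609598, 35.305, non_competing=True),
    Result("Bitwuzla", 0, 609598, 36.23),
    Result("CVC4-uc", 0, 578923, 9062.201),
    Result("MathSAT5", 0, 0, 20.049, non_competing=True),
]

def winner(results):
    # Non-competing entries are excluded from the ranking.
    competing = [r for r in results if not r.non_competing]
    # Fewest errors first, then highest correct score, then lowest wall-time score.
    return min(competing, key=lambda r: (r.error_score, -r.correct_score, r.wall_time_score))

print(winner(sequential).solver)  # -> "Yices2", matching the winners table above
```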