The International Satisfiability Modulo Theories (SMT) Competition.
Competition results for the QF_AUFBV division in the Unsat Core Track.
Page generated on 2020-07-04 11:49:33 +0000
Benchmarks: 34 · Time Limit: 1200 seconds · Memory Limit: 60 GB
Winners:

| Sequential Performance | Parallel Performance |
|---|---|
| Yices2 | Yices2 |
Sequential Performance:

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Abstained | Timeout | Memout |
|---|---|---|---|---|---|---|---|
| Yices2 | 0 | 18131 | 4599.237 | 4600.158 | 3 | 0 | |
| Yices2-fixedⁿ | 0 | 18131 | 4606.823 | 4607.686 | 3 | 0 | |
| z3ⁿ | 0 | 16283 | 6052.012 | 6051.845 | 4 | 0 | |
| Bitwuzla | 0 | 15283 | 248.481 | 248.683 | 0 | 0 | |
| Bitwuzla-fixedⁿ | 0 | 15283 | 255.862 | 249.5 | 0 | 0 | |
| CVC4-uc | 0 | 15248 | 10855.945 | 10858.898 | 9 | 0 | |
| MathSAT5ⁿ | 0 | 192 | 2.76 | 2.794 | 0 | 0 | |
Parallel Performance:

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Abstained | Timeout | Memout |
|---|---|---|---|---|---|---|---|
| Yices2 | 0 | 18131 | 4599.787 | 4599.948 | 3 | 0 | |
| Yices2-fixedⁿ | 0 | 18131 | 4607.353 | 4607.586 | 3 | 0 | |
| z3ⁿ | 0 | 16283 | 6052.982 | 6051.645 | 4 | 0 | |
| Bitwuzla | 0 | 15283 | 248.481 | 248.683 | 0 | 0 | |
| Bitwuzla-fixedⁿ | 0 | 15283 | 255.862 | 249.5 | 0 | 0 | |
| CVC4-uc | 0 | 15248 | 10858.445 | 10858.448 | 9 | 0 | |
| MathSAT5ⁿ | 0 | 192 | 2.76 | 2.794 | 0 | 0 | |
ⁿ Non-competing.
Abstained: Total number of benchmarks in this division's logics that the solver chose to abstain from. For SAT/UNSAT scores, this column also includes benchmarks not known to be SAT/UNSAT.