The International Satisfiability Modulo Theories (SMT) Competition.
Competition results for the QF_Bitvec division in the Model Validation Track.
Page generated on 2022-08-10 11:19:11 +0000
Benchmarks: 7324 | Time Limit: 1200 seconds | Memory Limit: 60 GB
Winners by category:

Sequential Performance | Parallel Performance
---|---
Bitwuzla | Bitwuzla
Sequential Performance

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Abstained | Timeout | Memout
---|---|---|---|---|---|---|---
2020-Bitwuzla^n | 0 | 7274 | 135131.97 | 134864.608 | 49 | 0 | 
Bitwuzla | 0 | 7272 | 138254.408 | 138058.655 | 51 | 0 | 
Yices2 | 0 | 7246 | 178045.342 | 177848.427 | 77 | 0 | 
STP | 0 | 7213 | 192246.216 | 192180.224 | 110 | 0 | 
z3-4.8.17^n | 0 | 7009 | 571080.815 | 570633.122 | 314 | 0 | 
cvc5 | 0 | 6958 | 566723.606 | 566246.512 | 360 | 3 | 
MathSAT^n | 0 | 6729 | 756992.123 | 756822.668 | 474 | 2 | 
Z3++BV | 9 | 7068 | 396744.629 | 396609.891 | 163 | 0 | 
Parallel Performance

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Abstained | Timeout | Memout
---|---|---|---|---|---|---|---
2020-Bitwuzla^n | 0 | 7274 | 135139.38 | 134862.608 | 49 | 0 | 
Bitwuzla | 0 | 7272 | 138262.308 | 138056.095 | 51 | 0 | 
Yices2 | 0 | 7246 | 178052.422 | 177845.717 | 77 | 0 | 
STP | 0 | 7213 | 192266.876 | 192176.014 | 110 | 0 | 
z3-4.8.17^n | 0 | 7009 | 571123.255 | 570621.912 | 314 | 0 | 
cvc5 | 0 | 6958 | 566815.106 | 566229.812 | 360 | 3 | 
MathSAT^n | 0 | 6728 | 757092.603 | 756800.388 | 474 | 2 | 
Z3++BV | 9 | 7068 | 396897.859 | 396601.771 | 163 | 0 | 
^n Non-competing.
Abstained: total number of benchmarks, across the logics in this division, that the solver chose to abstain from. For SAT/UNSAT scores, this column also counts benchmarks not known to be SAT/UNSAT.
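The row ordering in the tables above is consistent with a lexicographic comparison: fewer errors first, then more correct answers, then a lower wall-time score as tie-breaker. The sketch below illustrates that comparison on the Sequential Performance data; it is an assumption about how the rows are ordered, not the official SMT-COMP scoring implementation.

```python
# Rank solvers lexicographically: (error score asc, correct score desc,
# wall-time score asc). Rows copied from the Sequential Performance table.
rows = [
    # (solver, error score, correct score, wall-time score)
    ("2020-Bitwuzla", 0, 7274, 134864.608),
    ("Bitwuzla",      0, 7272, 138058.655),
    ("Yices2",        0, 7246, 177848.427),
    ("STP",           0, 7213, 192180.224),
    ("z3-4.8.17",     0, 7009, 570633.122),
    ("cvc5",          0, 6958, 566246.512),
    ("MathSAT",       0, 6729, 756822.668),
    ("Z3++BV",        9, 7068, 396609.891),
]

# Negating the correct score turns "descending" into an ascending sort key.
ranked = sorted(rows, key=lambda r: (r[1], -r[2], r[3]))

for name, err, correct, wall in ranked:
    print(f"{name:15s} errors={err} correct={correct} wall={wall:.1f}")
```

Note that Z3++BV sorts last despite a high correct score: under this comparison, any nonzero error score dominates everything else.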