The International Satisfiability Modulo Theories (SMT) Competition.
Competition results for the QF_Bitvec division in the Unsat Core Track.
Page generated on 2022-08-10 11:18:51 +0000
Benchmarks: 2887
Time Limit: 1200 seconds
Memory Limit: 60 GB
| Sequential Performance | Parallel Performance |
|---|---|
| Bitwuzla | Bitwuzla |
Sequential Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Abstained | Timeout | Memout |
|---|---|---|---|---|---|---|---|
| Bitwuzla | 0 | 2301687 | 247133.739 | 246989.583 | 139 | 0 | |
| 2020-Bitwuzla-fixedⁿ | 0 | 2217052 | 240620.225 | 240479.946 | 134 | 0 | |
| Yices2 | 0 | 1905566 | 350259.717 | 350163.36 | 264 | 0 | |
| z3-4.8.17ⁿ | 0 | 1633136 | 1321726.085 | 1321888.485 | 1010 | 0 | |
| cvc5 | 0 | 422993 | 2494764.535 | 2494983.9 | 1998 | 0 | |
| MathSATⁿ | 0 | 0 | 345.647 | 346.012 | 0 | 0 | |
Parallel Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Abstained | Timeout | Memout |
|---|---|---|---|---|---|---|---|
| Bitwuzla | 0 | 2301687 | 247155.949 | 246984.343 | 139 | 0 | |
| 2020-Bitwuzla-fixedⁿ | 0 | 2217052 | 240639.985 | 240475.036 | 134 | 0 | |
| Yices2 | 0 | 1905566 | 350303.297 | 350151.92 | 264 | 0 | |
| z3-4.8.17ⁿ | 0 | 1633136 | 1321872.315 | 1321846.265 | 1010 | 0 | |
| cvc5 | 0 | 422993 | 2494994.765 | 2494910.93 | 1998 | 0 | |
| MathSATⁿ | 0 | 0 | 345.647 | 346.012 | 0 | 0 | |
ⁿ Non-competing.
Abstained: total number of benchmarks in this division's logics that the solver chose to abstain from. For the SAT/UNSAT scores, this column also includes benchmarks whose SAT/UNSAT status is not known.