The International Satisfiability Modulo Theories (SMT) Competition.
Competition results for the QF_Bitvec division in the Model Validation Track.
Page generated on 2021-07-18 17:31:50 +0000
Benchmarks: 7268
Time Limit: 1200 seconds
Memory Limit: 60 GB
Winners:

| Sequential Performance | Parallel Performance |
|---|---|
| Bitwuzla | Bitwuzla |
Sequential Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Abstained | Timeout | Memout |
|---|---|---|---|---|---|---|---|
| 2020-Bitwuzlaⁿ | 0 | 7225 | 109751.631 | 109567.729 | 43 | 0 | |
| Bitwuzla | 0 | 7224 | 110131.843 | 109705.232 | 44 | 0 | |
| Yices2 model-validation | 0 | 7190 | 156904.835 | 156753.011 | 78 | 0 | |
| STP | 0 | 7108 | 344396.11 | 235149.159 | 159 | 0 | |
| cvc5-mv | 0 | 6950 | 529129.192 | 528686.431 | 310 | 4 | |
| z3-mvⁿ | 0 | 6894 | 640218.869 | 639758.58 | 302 | 0 | |
| MathSAT5ⁿ | 0 | 6720 | 709692.7 | 709492.783 | 433 | 0 | |
Parallel Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Abstained | Timeout | Memout |
|---|---|---|---|---|---|---|---|
| 2020-Bitwuzlaⁿ | 0 | 7225 | 109758.181 | 109565.989 | 43 | 0 | |
| Bitwuzla | 0 | 7224 | 110138.173 | 109703.782 | 44 | 0 | |
| Yices2 model-validation | 0 | 7190 | 156909.795 | 156750.681 | 78 | 0 | |
| STP | 0 | 7179 | 401150.49 | 186270.594 | 88 | 0 | |
| cvc5-mv | 0 | 6950 | 529193.012 | 528672.241 | 310 | 4 | |
| z3-mvⁿ | 0 | 6894 | 640279.459 | 639745.88 | 302 | 0 | |
| MathSAT5ⁿ | 0 | 6720 | 709799.65 | 709472.473 | 433 | 0 | |
ⁿ Non-competing.
Abstained: Total number of benchmarks in this division's logics that the solver chose to abstain from. For SAT/UNSAT scores, this column also includes benchmarks not known to be SAT/UNSAT.
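The ordering in the tables above can be reproduced with a short script. This is a minimal sketch, assuming an SMT-COMP-style lexicographic ranking: lowest error score first, then highest correct score, then lowest time score (the exact tie-breaking rules are defined by the competition, not by this page). The rows are taken from the sequential-performance table, with the non-competing marker stripped from solver names.

```python
# Rows from the sequential-performance table:
# (solver, error score, correct score, CPU time score)
rows = [
    ("2020-Bitwuzla", 0, 7225, 109751.631),
    ("Bitwuzla", 0, 7224, 110131.843),
    ("Yices2 model-validation", 0, 7190, 156904.835),
    ("STP", 0, 7108, 344396.11),
    ("cvc5-mv", 0, 6950, 529129.192),
    ("z3-mv", 0, 6894, 640218.869),
    ("MathSAT5", 0, 6720, 709692.7),
]

def rank(rows):
    # Assumed lexicographic key: fewer errors first, more correct
    # answers first, smaller CPU time score first.
    return sorted(rows, key=lambda r: (r[1], -r[2], r[3]))

for place, (solver, err, correct, cpu) in enumerate(rank(rows), start=1):
    print(f"{place}. {solver}: {correct} correct, {cpu:.3f}s CPU")
```

Under this ordering the first entry is 2020-Bitwuzla, matching the table; note that non-competing entries (marked ⁿ above) are excluded when the official winner is determined.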