The International Satisfiability Modulo Theories (SMT) Competition.
Competition results for the QF_ADT+BitVec division in the Model Validation Track.
Page generated on 2023-07-06 16:06:00 +0000
Benchmarks: 5251
Time Limit: 1200 seconds
Memory Limit: 60 GB
Logics:
This division is experimental; solvers are ranked by performance only, and no winner is selected.
| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Abstained | Timeout | Memout |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Bitwuzla Fixedⁿ | 0 | 5213 | 7751.025 | 7790.875 | 3 | 0 | |
| Bitwuzla | 0 | 5213 | 7765.231 | 7812.831 | 3 | 0 | |
| cvc5 | 0 | 4839 | 128277.483 | 128210.699 | 400 | 1 | |
| Yices2 | 0 | 7 | 2599.256 | 2599.96 | 9 | 0 | |
ⁿ Non-competing.
Abstained: the total number of benchmarks in this division's logics that the solver chose to abstain from. For SAT/UNSAT scores, this column also includes benchmarks not known to be SAT/UNSAT.
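In the Model Validation Track, a solver must answer `sat` and print a model, which an independent validator then checks against the benchmark. Below is a minimal hypothetical sketch of the kind of input this division covers, mixing an algebraic datatype with bit-vectors; the logic is hedged to `ALL`, since the division's exact logic list is not shown above.

```smt2
; Hypothetical miniature benchmark in the spirit of QF_ADT+BitVec:
; an algebraic datatype whose fields are 8-bit bit-vectors.
(set-option :produce-models true)
(set-logic ALL)
(declare-datatype Pair
  ((mk-pair (first (_ BitVec 8)) (second (_ BitVec 8)))))
(declare-const p Pair)
; Constrain the two fields to sum to 10 (#x0A).
(assert (= (bvadd (first p) (second p)) #x0A))
(check-sat)
(get-model)
```

A conforming solver prints `sat` followed by a model, for example a `define-fun` binding `p` to `(mk-pair #x03 #x07)`; the Correct Score column above counts, roughly, the benchmarks whose reported answer and model passed validation.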