The International Satisfiability Modulo Theories (SMT) Competition.
Competition results for the QF_Equality+Bitvec division in the Model Validation Track.
Page generated on 2022-08-10 11:19:11 +0000
Benchmarks: 375
Time Limit: 1200 seconds
Memory Limit: 60 GB
Logics:

Winners:

| Sequential Performance | Parallel Performance |
|---|---|
| Bitwuzla | Bitwuzla |
Sequential Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Abstained | Timeout | Memout |
|---|---|---|---|---|---|---|---|
| Bitwuzla | 0 | 373 | 3639.682 | 3634.177 | 2 | 0 | |
| 2021-Yices2 model-validation (n) | 0 | 373 | 4749.131 | 4750.812 | 2 | 0 | |
| Yices2 | 0 | 373 | 4752.854 | 4748.789 | 2 | 0 | |
| z3-4.8.17 (n) | 0 | 363 | 16246.942 | 16237.222 | 12 | 0 | |
| cvc5 | 0 | 362 | 24687.63 | 24658.414 | 13 | 0 | |
| MathSAT (n) | 0 | 359 | 20483.716 | 20488.646 | 16 | 0 | |
Parallel Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Abstained | Timeout | Memout |
|---|---|---|---|---|---|---|---|
| Bitwuzla | 0 | 373 | 3640.252 | 3634.037 | 2 | 0 | |
| Yices2 | 0 | 373 | 4753.204 | 4748.719 | 2 | 0 | |
| 2021-Yices2 model-validation (n) | 0 | 373 | 4749.651 | 4750.682 | 2 | 0 | |
| z3-4.8.17 (n) | 0 | 363 | 16248.792 | 16236.562 | 12 | 0 | |
| cvc5 | 0 | 362 | 24693.82 | 24657.914 | 13 | 0 | |
| MathSAT (n) | 0 | 359 | 20486.966 | 20487.776 | 16 | 0 | |
  
(n) Non-competing.
  
Abstained: Total number of benchmarks in the logics of this division that the solver chose to abstain from. For SAT/UNSAT scores, this column also includes benchmarks not known to be SAT/UNSAT.
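As a rough illustration of how the per-solver columns above relate to per-benchmark outcomes, the following Python sketch aggregates hypothetical per-benchmark results into the same columns. The record fields, status labels, and the assumption that the time scores sum over correctly solved benchmarks only are illustrative choices for this sketch, not taken from the official SMT-COMP scoring scripts.

```python
# Illustrative sketch only -- not the official SMT-COMP scoring implementation.
from dataclasses import dataclass

@dataclass
class BenchmarkResult:
    # Hypothetical per-benchmark record for one solver run.
    status: str        # "correct", "error", "timeout", "memout", or "abstained" (assumed labels)
    cpu_time: float    # CPU time in seconds
    wall_time: float   # wall-clock time in seconds

def score(results: list[BenchmarkResult]) -> dict:
    """Aggregate one solver's per-benchmark results into table columns."""
    correct = [r for r in results if r.status == "correct"]
    return {
        "error_score":     sum(r.status == "error" for r in results),
        "correct_score":   len(correct),
        # Assumption: time scores accumulate only over correctly solved benchmarks.
        "cpu_time_score":  round(sum(r.cpu_time for r in correct), 3),
        "wall_time_score": round(sum(r.wall_time for r in correct), 3),
        "abstained":       sum(r.status == "abstained" for r in results),
        "timeout":         sum(r.status == "timeout" for r in results),
        "memout":          sum(r.status == "memout" for r in results),
    }
```

On this reading, Bitwuzla's sequential row would correspond to 373 "correct" results and 2 "abstained" benchmarks out of the 375 in the division, with no errors or timeouts.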