SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.


SMT-COMP 2020


QF_BV (Model Validation Track)

Competition results for the QF_BV division in the Model Validation Track.

Page generated on 2020-07-04 11:50:14 +0000

Benchmarks: 5752
Time Limit: 1200 seconds
Memory Limit: 60 GB
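
For context on what this track measures: in the Model Validation Track a solver must not only answer "sat" on a satisfiable benchmark but also emit a full model, which is then checked against the benchmark's assertions by an independent validator. The sketch below illustrates that check on a toy QF_BV instance using the z3 Python bindings; it is only an illustration of the idea, not the competition's actual validation harness, and the constraints and variable names are invented for the example.

```python
# Toy illustration of the Model Validation Track idea, using the z3 Python
# bindings (pip install z3-solver). This is NOT the competition's validator;
# the QF_BV constraints below are invented for the example.
from z3 import BitVec, Solver, sat, is_true

# A small QF_BV "benchmark": two 8-bit variables and two assertions.
x = BitVec("x", 8)
y = BitVec("y", 8)
assertions = [x + y == 42, x & y == 0]

# Step 1: the solver answers sat and produces a model.
solver = Solver()
solver.add(*assertions)
assert solver.check() == sat
model = solver.model()

# Step 2: validation -- re-evaluate every original assertion under the
# returned model and require that each one reduces to "true".
validates = all(
    is_true(model.evaluate(a, model_completion=True)) for a in assertions
)
print("model validates:", validates)
```

Roughly speaking, an answer whose model does not survive this kind of re-check is charged to the Error Score column in the tables below (compare the MathSAT5 footnote about incomplete models).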

Winners

Sequential Performance: Bitwuzla
Parallel Performance: Bitwuzla

Sequential Performance

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Abstained | Timeout | Memout
Bitwuzla | 0 | 5731 | 83067.669 | 82945.701 | 0 | 21 | 0
Bitwuzla-fixed (n) | 0 | 5730 | 82743.866 | 82630.755 | 0 | 22 | 0
Yices2-fixed Model Validation (n) | 0 | 5728 | 83587.749 | 83460.74 | 0 | 24 | 0
2019-Boolector (n) | 0 | 5717 | 92007.406 | 92692.142 | 0 | 35 | 0
CVC4-mv | 0 | 5693 | 181021.074 | 180813.232 | 0 | 51 | 4
STP + MergeSAT | 0 | 5675 | 181439.148 | 181945.583 | 0 | 75 | 0
STP + CMS | 0 | 5634 | 256210.435 | 175332.376 | 0 | 116 | 0
z3 (n) | 1 | 5552 | 443696.534 | 443476.345 | 0 | 199 | 0
Yices2 Model Validation | 43 | 5600 | 178499.348 | 178415.635 | 0 | 109 | 0
MathSAT5-mv (n) | 104* | 5321 | 545033.862 | 544920.002 | 0 | 324 | 0

Parallel Performance

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Abstained | Timeout | Memout
Bitwuzla | 0 | 5731 | 83070.369 | 82945.121 | 0 | 21 | 0
Bitwuzla-fixed (n) | 0 | 5730 | 82748.286 | 82629.745 | 0 | 22 | 0
Yices2-fixed Model Validation (n) | 0 | 5728 | 83591.029 | 83459.79 | 0 | 24 | 0
2019-Boolector (n) | 0 | 5717 | 92323.196 | 92690.332 | 0 | 35 | 0
STP + CMS | 0 | 5716 | 317862.725 | 119317.817 | 0 | 34 | 0
CVC4-mv | 0 | 5693 | 181038.304 | 180811.162 | 0 | 51 | 4
STP + MergeSAT | 0 | 5675 | 181450.708 | 181942.903 | 0 | 75 | 0
z3 (n) | 1 | 5552 | 443732.334 | 443467.755 | 0 | 199 | 0
Yices2 Model Validation | 43 | 5600 | 178503.118 | 178412.985 | 0 | 109 | 0
MathSAT5-mv (n) | 104* | 5321 | 545105.872 | 544905.352 | 0 | 324 | 0

(n) Non-competing.
Abstained: total number of benchmarks, across the logics in this division, on which the solver chose to abstain. For SAT/UNSAT scores, this column also includes benchmarks not known to be SAT/UNSAT.
* The error score is caused by MathSAT not producing full models (a syntactic issue); it does not indicate unsoundness.