SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2021

QF_Bitvec (Model Validation Track)

Competition results for the QF_Bitvec division in the Model Validation Track.

Page generated on 2021-07-18 17:31:50 +0000

Benchmarks: 7268
Time Limit: 1200 seconds
Memory Limit: 60 GB

Logics: QF_BV

Winners

Sequential Performance: Bitwuzla
Parallel Performance: Bitwuzla

Sequential Performance

Solver                    Error Score  Correct Score  CPU Time Score  Wall Time Score  Abstained  Timeout  Memout
2020-Bitwuzla (n)                   0           7225      109751.631       109567.729          0       43       0
Bitwuzla                            0           7224      110131.843       109705.232          0       44       0
Yices2 model-validation             0           7190      156904.835       156753.011          0       78       0
STP                                 0           7108       344396.11       235149.159          0      159       0
cvc5-mv                             0           6950      529129.192       528686.431          0      310       4
z3-mv (n)                           0           6894      640218.869        639758.58          0      302       0
MathSAT5 (n)                        0           6720        709692.7       709492.783          0      433       0

Parallel Performance

Solver                    Error Score  Correct Score  CPU Time Score  Wall Time Score  Abstained  Timeout  Memout
2020-Bitwuzla (n)                   0           7225      109758.181       109565.989          0       43       0
Bitwuzla                            0           7224      110138.173       109703.782          0       44       0
Yices2 model-validation             0           7190      156909.795       156750.681          0       78       0
STP                                 0           7179       401150.49       186270.594          0       88       0
cvc5-mv                             0           6950      529193.012       528672.241          0      310       4
z3-mv (n)                           0           6894      640279.459        639745.88          0      302       0
MathSAT5 (n)                        0           6720       709799.65       709472.473          0      433       0

(n) Non-competing.
Abstained: Total number of benchmarks in this division's logics that the solver chose to abstain from. For SAT/UNSAT scores, this column also includes benchmarks not known to be SAT/UNSAT.
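The winners listed above follow directly from these scores. As a minimal illustration (not official SMT-COMP tooling; the authoritative ranking procedure is defined in the Rules), the sequential winner can be recomputed from the table, assuming solvers are ordered by error score (lower first), then correct score (higher first), then CPU time score (lower first), with non-competing (n) entries excluded:

# Recompute the sequential-performance winner from the table above.
# (solver, non_competing, error_score, correct_score, cpu_time_score)
sequential = [
    ("2020-Bitwuzla", True, 0, 7225, 109751.631),
    ("Bitwuzla", False, 0, 7224, 110131.843),
    ("Yices2 model-validation", False, 0, 7190, 156904.835),
    ("STP", False, 0, 7108, 344396.11),
    ("cvc5-mv", False, 0, 6950, 529129.192),
    ("z3-mv", True, 0, 6894, 640218.869),
    ("MathSAT5", True, 0, 6720, 709692.7),
]

def winner(rows):
    # Keep only competing solvers, then rank: fewest errors,
    # most correct answers, lowest time score.
    competing = [r for r in rows if not r[1]]
    return min(competing, key=lambda r: (r[2], -r[3], r[4]))[0]

print(winner(sequential))  # -> Bitwuzla

The parallel table is ranked the same way with the wall time score as the final tie-breaker, which again yields Bitwuzla.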