SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.


SMT-COMP 2020


QF_BV (Model Validation Track)

Competition results for the QF_BV division in the Model Validation Track.


Benchmarks: 5752
Time Limit: 1200 seconds
Memory Limit: 60 GB
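
In this track a solver must not only decide satisfiability but, for each benchmark it reports sat, also print a model that an independent validator checks against the original assertions. The following SMT-LIB interaction is a minimal constructed illustration (the benchmark and the model shown are examples, not taken from the competition set):

    (set-logic QF_BV)
    (declare-const x (_ BitVec 8))
    (declare-const y (_ BitVec 8))
    (assert (= (bvadd x y) #x2a))
    (assert (bvult x y))
    (check-sat)
    ; solver answers: sat
    (get-model)
    ; solver prints a full model, e.g.
    ;   ((define-fun x () (_ BitVec 8) #x01)
    ;    (define-fun y () (_ BitVec 8) #x29))
    ; the validator re-evaluates the assertions under this assignment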

Winners

Sequential Performance: Bitwuzla
Parallel Performance:   Bitwuzla

Sequential Performance

Solver                               Error Score  Correct Score  CPU Time Score  Wall Time Score  Timeout  Memout
Bitwuzla                             0            5731           83067.669       82945.701        21       0
Bitwuzla-fixed [n]                   0            5730           82743.866       82630.755        22       0
Yices2-fixed Model Validation [n]    0            5728           83587.749       83460.742        4        0
2019-Boolector [n]                   0            5717           92007.406       92692.142        35       0
CVC4-mv                              0            5693           181021.074      180813.232       51       4
STP + MergeSAT                       0            5675           181439.148      181945.583       75       0
STP + CMS                            0            5634           256210.435      175332.376       116      0
z3 [n]                               1            5552           443696.534      443476.345       199      0
Yices2 Model Validation              43           5600           178499.348      178415.635       109      0
MathSAT5-mv [n]                      104 [*]      5321           545033.862      544920.002       324      0

Parallel Performance

Solver                               Error Score  Correct Score  CPU Time Score  Wall Time Score  Timeout  Memout
Bitwuzla                             0            5731           83070.369       82945.121        21       0
Bitwuzla-fixed [n]                   0            5730           82748.286       82629.745        22       0
Yices2-fixed Model Validation [n]    0            5728           83591.029       83459.792        4        0
2019-Boolector [n]                   0            5717           92323.196       92690.332        35       0
STP + CMS                            0            5716           317862.725      119317.817       34       0
CVC4-mv                              0            5693           181038.304      180811.162       51       4
STP + MergeSAT                       0            5675           181450.708      181942.903       75       0
z3 [n]                               1            5552           443732.334      443467.755       199      0
Yices2 Model Validation              43           5600           178503.118      178412.985       109      0
MathSAT5-mv [n]                      104 [*]      5321           545105.872      544905.352       324      0

[n] Non-competing.
[*] The error score is caused by MathSAT not giving full models (syntactic problems); it does not indicate unsoundness (see the illustration below).
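
The footnoted MathSAT behavior can be pictured with a constructed SMT-LIB exchange (this is an illustration of a syntactically incomplete model, not an actual MathSAT output):

    (set-logic QF_BV)
    (declare-const x (_ BitVec 8))
    (declare-const y (_ BitVec 8))
    (assert (bvult x y))
    (check-sat)
    ; solver answers: sat
    (get-model)
    ; a response such as
    ;   ((define-fun x () (_ BitVec 8) #x00))
    ; gives no value for y, so the validator rejects the model on
    ; syntactic grounds even though the sat answer itself is sound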