SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.


SMT-COMP 2019


QF_BV (Model Validation Track (experimental))

Competition results for the QF_BV division in the Model Validation Track (experimental).

Page generated on 2019-07-23 17:57:57 +0000

Benchmarks: 7191
Time Limit: 2400 seconds
Memory Limit: 60 GB

This track is experimental: solvers are ranked by performance only, and no winner is declared.
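In this track a solver must not only answer sat but also emit a model, which is then checked by substituting it back into the benchmark's assertions. A minimal sketch of that substitution check, in Python and over a made-up toy QF_BV instance (this is illustrative only, not the official SMT-COMP validator):

```python
# Illustrative sketch (not the official validator): model validation means
# substituting the solver's model into every assertion of the benchmark and
# checking that each one evaluates to true.

MASK8 = 0xFF  # 8-bit bit-vector arithmetic wraps modulo 2^8

# Assertions of a toy QF_BV instance:  (assert (= (bvadd x #x01) #x00))
assertions = [
    lambda m: (m["x"] + 0x01) & MASK8 == 0x00,
]

def validate(model, assertions):
    """Return True iff every assertion holds under the candidate model."""
    return all(a(model) for a in assertions)

print(validate({"x": 0xFF}, assertions))  # correct model: prints True
print(validate({"x": 0x00}, assertions))  # wrong model:   prints False
```

A solver is only credited with a correct answer on a sat instance if its model passes this kind of check; here x = #xff is the unique solution.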

Sequential Performance

Solver                                      Error Score  Correct Score  CPU Time Score  Wall Time Score  Timeout  Memout
Boolector                                   0            7171           161311.749      161334.219       19       0
CVC4-mv                                     0            7152           232895.987      232608.291       31       3
Yices 2.6.2 CaDiCal/SMT-LIB2 Models         0            7143           202444.731      202365.79        47       0
Minkeyrink Solver                           0            7142           239971.427      239870.732       48       0
Yices 2.6.2 Model Validation                0            7129           282734.621      282739.257       61       0
Yices 2.6.2 New Bvsolver with SMT2 Models   0            7128           241766.842      241786.166       62       0
Minkeyrink Solver MT                        0            7103           369834.416      253321.647       87       0
Yices 2.6.2 Cryptominisat/SMT-LIB2 Models   0            7083           374426.859      374118.103       105      0
STP-incremental                             0            7009           676626.088      677275.285       179      0
STP-mt                                      0            0              893242.449      692213.598       254      0

Parallel Performance

Solver                                      Error Score  Correct Score  CPU Time Score  Wall Time Score  Timeout  Memout
Boolector                                   0            7171           161315.079      161333.589       19       0
CVC4-mv                                     0            7152           232909.657      232606.841       31       3
Minkeyrink Solver MT                        0            7151           506451.066      203284.052       39       0
Yices 2.6.2 CaDiCal/SMT-LIB2 Models         0            7143           202455.451      202363.94        47       0
Minkeyrink Solver                           0            7142           239984.177      239869.272       48       0
Yices 2.6.2 Model Validation                0            7129           282747.331      282736.847       61       0
Yices 2.6.2 New Bvsolver with SMT2 Models   0            7128           241779.532      241783.666       62       0
Yices 2.6.2 Cryptominisat/SMT-LIB2 Models   0            7083           374452.909      374113.933       105      0
STP-incremental                             0            7009           676666.938      677268.215       179      0
STP-mt                                      0            0              1122106.179     592689.298       166      0

Entries marked n are non-competing.