SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.


SMT-COMP 2020


QF_BV (Incremental Track)

Competition results for the QF_BV division in the Incremental Track.

Page generated on 2020-07-04 11:47:56 +0000

Benchmarks: 1035
Time Limit: 1200 seconds
Memory Limit: 60 GB

Winners

Parallel Performance: Yices2 incremental

Parallel Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Unsolved | Abstained | Timeout | Memout |
|---|---:|---:|---:|---:|---:|---:|---:|---:|
| Yices2-fixed incrementalⁿ | 0 | 18669 | 20835.305 | 20741.555 | 395 | 0 | 11 | 0 |
| Yices2 incremental | 0 | 18667 | 20923.314 | 20740.384 | 397 | 0 | 11 | 0 |
| 2019-Yices 2.6.2 Incrementalⁿ | 0 | 18643 | 21069.47 | 21034.235 | 421 | 0 | 12 | 0 |
| STP + CMS | 0 | 18636 | 19265.608 | 16123.708 | 428 | 0 | 11 | 0 |
| Bitwuzla-fixedⁿ | 0 | 18614 | 19550.613 | 19481.889 | 450 | 0 | 8 | 0 |
| Bitwuzla | 0 | 18602 | 19772.61 | 19630.488 | 462 | 0 | 8 | 0 |
| STP + MergeSAT | 0 | 18586 | 16525.963 | 16446.027 | 478 | 0 | 11 | 0 |
| z3ⁿ | 0 | 18568 | 43808.158 | 43725.491 | 496 | 0 | 15 | 0 |
| MathSAT5ⁿ | 0 | 18551 | 31162.948 | 31095.712 | 513 | 0 | 15 | 0 |
| CVC4-inc | 0 | 18466 | 60912.73 | 60836.537 | 598 | 0 | 11 | 0 |
| LazyBV2Int | 0 | 14781 | 333740.613 | 333648.351 | 4283 | 0 | 258 | 0 |
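The winner above can be reproduced from the table data. This is a minimal sketch, assuming the parallel-performance ranking orders solvers lexicographically by error score (ascending), then correct score (descending), then wall-time score (ascending), and that entries marked ⁿ (non-competing) are excluded from the winner computation; the exact criterion is defined in the SMT-COMP 2020 rules.

```python
# Sketch of the parallel-performance ranking (assumed criterion: fewest
# errors, then most correct check-sat responses, then lowest wall time;
# non-competing entries marked "n" on the page are excluded).
from typing import NamedTuple

class Result(NamedTuple):
    solver: str
    error: int        # Error Score column
    correct: int      # Correct Score column
    wall_time: float  # Wall Time Score column, in seconds
    competing: bool   # False for entries marked "n" (non-competing)

# Data taken verbatim from the results table above.
results = [
    Result("Yices2-fixed incremental",      0, 18669,  20741.555, False),
    Result("Yices2 incremental",            0, 18667,  20740.384, True),
    Result("2019-Yices 2.6.2 Incremental",  0, 18643,  21034.235, False),
    Result("STP + CMS",                     0, 18636,  16123.708, True),
    Result("Bitwuzla-fixed",                0, 18614,  19481.889, False),
    Result("Bitwuzla",                      0, 18602,  19630.488, True),
    Result("STP + MergeSAT",                0, 18586,  16446.027, True),
    Result("z3",                            0, 18568,  43725.491, False),
    Result("MathSAT5",                      0, 18551,  31095.712, False),
    Result("CVC4-inc",                      0, 18466,  60836.537, True),
    Result("LazyBV2Int",                    0, 14781, 333648.351, True),
]

def ranking(rows):
    """Rank competing solvers: fewest errors, most correct, lowest wall time."""
    competing = [r for r in rows if r.competing]
    return sorted(competing, key=lambda r: (r.error, -r.correct, r.wall_time))

print(ranking(results)[0].solver)  # Yices2 incremental
```

Under this criterion the fixed (post-deadline) and 2019 entries, although they solve slightly more queries, do not displace Yices2 incremental as the competition winner because they are non-competing.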

ⁿ Non-competing.
Abstained: total number of benchmarks, across the logics in this division, that the solver chose to abstain from. For the SAT/UNSAT scores, this column also counts benchmarks whose status is not known to be SAT/UNSAT.