SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2021

QF_Equality+Bitvec (Model Validation Track)

Competition results for the QF_Equality+Bitvec division in the Model Validation Track.

Page generated on 2021-07-18 17:31:50 +0000

Benchmarks: 375
Time Limit: 1200 seconds
Memory Limit: 60 GB

Note: This division is experimental. Solvers are ranked only by performance; no winner is selected.

Sequential Performance

Solver                     Error Score  Correct Score  CPU Time Score  Wall Time Score  Abstained  Timeout  Memout
Yices2 model-validation    0            374            2185.384        2186.329          1         0        0
Bitwuzla                   0            374            2616.587        2574.322          1         0        0
MathSAT5 ⁿ                 0            362            16905.111       16911.378        13         0        0
cvc5-mv                    0            361            26215.782       26219.147        14         0        0
z3-mv ⁿ                    0            358            23531.293       23525.065        17         0        0
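Since this experimental division ranks solvers by performance without selecting a winner, the ordering in the table above follows correct score (higher is better), with wall-time score (lower is better) breaking ties. A minimal Python sketch reproducing that ordering from the sequential scores above; the data literals are transcribed from the table, and the sort key is illustrative, not the official SMT-COMP scoring implementation:

```python
# Sequential results: (solver, correct score, wall-time score),
# transcribed from the table above.
results = [
    ("Yices2 model-validation", 374, 2186.329),
    ("Bitwuzla", 374, 2574.322),
    ("MathSAT5", 362, 16911.378),
    ("cvc5-mv", 361, 26219.147),
    ("z3-mv", 358, 23525.065),
]

# Rank by correct score descending, then wall-time score ascending.
ranked = sorted(results, key=lambda r: (-r[1], r[2]))

for name, correct, wall in ranked:
    print(f"{name}: {correct} correct, {wall:.3f} s wall time")
```

Note that the tiebreak matters for the top two entries: Yices2 model-validation and Bitwuzla both solve 374 benchmarks, so the lower wall-time score places Yices2 first.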

Parallel Performance

Solver                     Error Score  Correct Score  CPU Time Score  Wall Time Score  Abstained  Timeout  Memout
Yices2 model-validation    0            374            2185.624        2186.309          1         0        0
Bitwuzla                   0            374            2616.957        2574.302          1         0        0
MathSAT5 ⁿ                 0            362            16909.551       16910.788        13         0        0
cvc5-mv                    0            361            26221.662       26218.607        14         0        0
z3-mv ⁿ                    0            358            23534.363       23524.385        17         0        0

ⁿ Non-competing.
Abstained: Total number of benchmarks in this division's logics that the solver chose to abstain from. For SAT/UNSAT scores, this column also includes benchmarks not known to be SAT/UNSAT.