SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2023

Biggest Lead Ranking - Incremental Track

Page generated on 2023-07-06 16:05:24 +0000

Winners

Parallel Performance: Bitwuzla

Parallel Performance

Solver       Correct Score   Time Score    Division
Bitwuzla     2.77167277      0.0474286     Equality+MachineArith
SMTInterpol  2.40420235      4.07216308    QF_NonLinearIntArith
Bitwuzla     1.64455569      0.10057209    FPArith
cvc5         1.52807539      1.05908599    Equality+NonLinearArith
cvc5         1.46551818      1.96865199    Equality
Yices2       1.18725751      1.70623311    QF_LinearIntArith
Bitwuzla     1.14714445      43.34195912   QF_FPArith
SMTInterpol  1.05892894      0.9725261     QF_Equality+LinearArith
OpenSMT      1.05181918      1.33800247    QF_LinearRealArith
cvc5         1.04245994      0.03777217    QF_Equality+NonLinearArith
cvc5         1.03331699      0.2654385     Bitvec
cvc5         1.01793463      5.36797651    Equality+LinearArith
Bitwuzla     1.00533428      0.60574509    QF_Equality+Bitvec
Bitwuzla     1.00317615      0.7042866     QF_Bitvec
cvc5         1.0             2.83615456    Arith
cvc5         1.0             2.28907187    QF_Equality
cvc5         0.93154034      0.23699811    QF_Equality+Bitvec+Arith

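The Correct Score and Time Score columns are lead ratios comparing each division's best solver to its runner-up: a value near 1.0 (as in QF_Bitvec) indicates a very close race, while a larger value indicates a wider margin. Below is a minimal Python sketch of how such a correct-score lead could be computed, assuming a ratio of the form (1 + best) / (1 + runner-up) over correctly solved instances; the class and function names are illustrative, and the authoritative definitions are in the SMT-COMP 2023 rules.

```python
# Minimal sketch of a "biggest lead"-style correct-score lead.
# Assumption (not stated on this page): the lead of a division's best
# solver is (1 + correct_best) / (1 + correct_runner_up); the exact
# formula and tie-breaking are given in the SMT-COMP 2023 rules.
from dataclasses import dataclass


@dataclass
class SolverResult:
    solver: str
    correct: int  # correctly solved instances in one division


def correct_score_lead(results: list[SolverResult]) -> tuple[str, float]:
    """Return (leading solver, its lead ratio over the runner-up)."""
    ranked = sorted(results, key=lambda r: r.correct, reverse=True)
    best, runner_up = ranked[0], ranked[1]
    return best.solver, (1 + best.correct) / (1 + runner_up.correct)


if __name__ == "__main__":
    # Hypothetical counts: a near-tie yields a lead only slightly above 1.0,
    # as seen in divisions such as QF_Bitvec above.
    division = [SolverResult("SolverA", 903), SolverResult("SolverB", 900)]
    print(correct_score_lead(division))
```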