SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.


SMT-COMP 2022


Biggest Lead Ranking - Incremental Track

Page generated on 2022-08-10 11:18:22 +0000

Winners

Parallel Performance: smtinterpol

Parallel Performance

Solver                        Correct Score   Time Score    Division
smtinterpol                   2.48905696      3.63425053    QF_NonLinearIntArith
Yices2                        1.85578631      2.0099945     QF_LinearIntArith
cvc5                          1.48916592      1.240465      Equality+NonLinearArith
cvc5                          1.20828761      2.06475977    Equality
Yices2                        1.19119048      2.42718976    QF_Equality+Bitvec
smtinterpol                   1.11456386      5.84067341    QF_Equality+NonLinearArith
Bitwuzla                      1.11455789      24.5839296    QF_FPArith
Bitwuzla                      1.06015504      1.1513549     FPArith
cvc5                          1.05437693      1.33980095    Bitvec
smtinterpol                   1.04804128      1.06683902    QF_Equality+LinearArith
cvc5                          1.0241521       8.2626928     Equality+LinearArith
Yices2                        1.00493061      0.73911608    QF_Bitvec
OpenSMT                       1.00332226      1.30221332    QF_LinearRealArith
UltimateEliminator+MathSAT    1.0             14.65549157   Equality+MachineArith
Yices2                        1.0             8.40762978    QF_Equality
cvc5                          1.0             1.64129678    Arith

n Non-competing.
e Experimental.
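The scores in the table are lead ratios: a biggest-lead ranking compares the top solver's score in a division to the runner-up's, so a value of 1.0 means the leader is tied with second place. A minimal sketch of that computation, assuming the lead is the plain ratio of the best score to the second-best score (the `biggest_lead` helper and the sample scores below are illustrative, not taken from this page):

```python
def biggest_lead(scores):
    """Given one division's scores (solver name -> score),
    return the leading solver and its lead ratio over second place."""
    # Rank solvers by score, highest first.
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    (top, s1), (_, s2) = ranked[0], ranked[1]
    return top, s1 / s2

# Example with made-up scores for a single division:
division = {"solverA": 120.0, "solverB": 80.0, "solverC": 10.0}
print(biggest_lead(division))  # ('solverA', 1.5)
```

Ranking divisions by this ratio then picks out where one solver is furthest ahead of the field, which is how the divisions above end up ordered by their Correct Score.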