SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.


SMT-COMP 2022


Largest Contribution Ranking - Unsat Core Track

Page generated on 2022-08-10 11:18:58 +0000

Winners

Sequential Performance: cvc5
Parallel Performance: cvc5

Sequential Performance

Solver    Correct Score  Time Score  Division
cvc5      0.0521587      0.08599203  Equality+NonLinearArith
cvc5      0.04115421     0.40739055  Equality+LinearArith
Bitwuzla  0.01043474     0.01544786  QF_Equality+Bitvec
Bitwuzla  0.00991293     0.01353864  QF_Bitvec
cvc5      0.00894203     0.00201093  Arith
Yices2    0.00838525     0.00228754  QF_Equality+LinearArith
cvc5      0.0056515      0.04355741  Equality
Yices2    0.00170943     0.00467974  QF_LinearIntArith
Yices2    0.00110382     0.0071915   QF_Equality
Yices2    0.00043344     0.00057085  QF_LinearRealArith
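As a rough illustration only (not the official SMT-COMP largest-contribution computation, which is defined in the competition rules), the per-division correct scores listed above can be summed per solver to see which solver accounts for the largest share of the ranking overall:

```python
from collections import defaultdict

# Per-division "Correct Score" values copied from the sequential table above.
sequential_correct = [
    ("cvc5",     "Equality+NonLinearArith", 0.0521587),
    ("cvc5",     "Equality+LinearArith",    0.04115421),
    ("Bitwuzla", "QF_Equality+Bitvec",      0.01043474),
    ("Bitwuzla", "QF_Bitvec",               0.00991293),
    ("cvc5",     "Arith",                   0.00894203),
    ("Yices2",   "QF_Equality+LinearArith", 0.00838525),
    ("cvc5",     "Equality",                0.0056515),
    ("Yices2",   "QF_LinearIntArith",       0.00170943),
    ("Yices2",   "QF_Equality",             0.00110382),
    ("Yices2",   "QF_LinearRealArith",      0.00043344),
]

def total_by_solver(rows):
    """Sum each solver's per-division scores and sort descending."""
    totals = defaultdict(float)
    for solver, _division, score in rows:
        totals[solver] += score
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

ranking = total_by_solver(sequential_correct)
for solver, total in ranking:
    print(f"{solver:10s} {total:.8f}")
```

Summing this way puts cvc5 first, consistent with the winners listed above.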

Parallel Performance

Solver    Correct Score  Time Score  Division
cvc5      0.04957972     0.08791668  Equality+NonLinearArith
cvc5      0.03817682     0.41072164  Equality+LinearArith
Bitwuzla  0.01043474     0.01545614  QF_Equality+Bitvec
Bitwuzla  0.00991293     0.01354821  QF_Bitvec
cvc5      0.00894203     0.00200388  Arith
Yices2    0.00804904     0.00210806  QF_Equality+LinearArith
Vampire   0.0056803      0.04779834  Equality
Yices2    0.00170942     0.00423363  QF_LinearIntArith
Yices2    0.00110382     0.00713069  QF_Equality
Yices2    0.00043226     0.00049105  QF_LinearRealArith

Legend: n = non-competing, e = experimental.