SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2020

Largest Contribution Ranking - Unsat Core Track

Page generated on 2020-07-04 11:49:39 +0000

Winners

| Sequential Performance | Parallel Performance |
| --- | --- |
| CVC4-uc | CVC4-uc |

Sequential Performance

| Solver | Correct Score | Time Score | Division |
| --- | --- | --- | --- |
| CVC4-uc | 0.06083489 | 0.34079307 | AUFLIRA |
| CVC4-uc | 0.05827922 | 0.1338736 | UFLIA |
| CVC4-uc | 0.04005245 | 0.05764296 | UF |
| CVC4-uc | 0.01516674 | 0.03249943 | AUFLIA |
| CVC4-uc | 0.00991001 | 0.00039505 | LIA |
| CVC4-uc | 0.00376889 | 2.5e-07 | QF_AUFLIA |
| Yices2 | 0.0027606 | 0.00019705 | QF_AX |
| Bitwuzla | 0.00165592 | 0.03003442 | QF_BV |
| Yices2 | 0.00162197 | 0.00108525 | QF_LIA |
| Bitwuzla | 0.00131906 | 0.01475387 | QF_ABV |
| Yices2 | 0.0010159 | 0.00139522 | QF_LRA |
| CVC4-uc | 0.00071746 | 0.00147252 | UFIDL |
| SMTInterpol | 0.00067835 | 0.00384134 | QF_UF |
| Yices2 | 0.00025858 | 0.00101831 | QF_AUFBV |
| CVC4-uc | 0.00016535 | 0.00021098 | QF_UFLRA |
| CVC4-uc | 0.00014086 | 0.00334217 | QF_UFBV |
| SMTInterpol | 6.139e-05 | -0.00348929 | UFLRA |
| CVC4-uc | 4.793e-05 | 0.00102471 | QF_ALIA |
| Yices2 | 0.0 | 0.00074869 | QF_UFLIA |
| Yices2 | 0.0 | 0.0004564 | QF_UFIDL |

Parallel Performance

| Solver | Correct Score | Time Score | Division |
| --- | --- | --- | --- |
| CVC4-uc | 0.06083489 | 0.34055887 | AUFLIRA |
| CVC4-uc | 0.05827922 | 0.13389345 | UFLIA |
| CVC4-uc | 0.04005245 | 0.05763022 | UF |
| CVC4-uc | 0.01516674 | 0.03244898 | AUFLIA |
| CVC4-uc | 0.00991001 | 0.00039473 | LIA |
| CVC4-uc | 0.00376889 | 2.8e-07 | QF_AUFLIA |
| Yices2 | 0.0027606 | 0.00019596 | QF_AX |
| Bitwuzla | 0.00165592 | 0.03004478 | QF_BV |
| Yices2 | 0.00162197 | 0.00099558 | QF_LIA |
| Bitwuzla | 0.00131906 | 0.01475142 | QF_ABV |
| Yices2 | 0.0010159 | 0.00127942 | QF_LRA |
| CVC4-uc | 0.00071746 | 0.00147252 | UFIDL |
| SMTInterpol | 0.00067835 | 0.00394046 | QF_UF |
| Yices2 | 0.00025858 | 0.00101829 | QF_AUFBV |
| CVC4-uc | 0.00016535 | 0.00021098 | QF_UFLRA |
| CVC4-uc | 0.00014086 | 0.00334212 | QF_UFBV |
| SMTInterpol | 6.139e-05 | -0.00151176 | UFLRA |
| CVC4-uc | 4.793e-05 | 0.0009322 | QF_ALIA |
| Yices2 | 0.0 | 0.00060774 | QF_UFLIA |
| Yices2 | 0.0 | 0.00010903 | QF_UFIDL |

Key: n = non-competing, e = experimental.
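
For readers who want to work with these numbers directly, the sketch below shows one way to tabulate the per-division contributions by solver. It is only an illustration: the abbreviated data literal is copied from the sequential table above, and summing correct scores per solver is an assumed aggregation for exploration, not the official procedure used to determine the track winner (see the competition rules for that).

```python
# Minimal sketch (not the official SMT-COMP evaluation): aggregate the
# per-division contribution scores listed above by solver. The row subset
# and the choice to sum correct scores are illustrative assumptions only.
from collections import defaultdict

# (solver, correct score, time score, division) -- abbreviated copy of the
# "Sequential Performance" table; remaining divisions omitted for brevity.
rows = [
    ("CVC4-uc",     0.06083489, 0.34079307, "AUFLIRA"),
    ("CVC4-uc",     0.05827922, 0.1338736,  "UFLIA"),
    ("Yices2",      0.0027606,  0.00019705, "QF_AX"),
    ("Bitwuzla",    0.00165592, 0.03003442, "QF_BV"),
    ("SMTInterpol", 0.00067835, 0.00384134, "QF_UF"),
]

totals = defaultdict(float)
for solver, correct, _time, _division in rows:
    totals[solver] += correct

# Print solvers by summed correct-score contribution, largest first.
for solver, total in sorted(totals.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{solver:12s} {total:.8f}")
```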