SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.


SMT-COMP 2019


Biggest Lead Ranking - Incremental Track

Page generated on 2019-07-23 17:57:24 +0000

Winners

Parallel Performance
CVC4-inc

Parallel Performance

Solver                   | Correct Score | Time Score  | Division
CVC4-inc                 | 23087.0       | 0.00176848  | ANIA
CVC4-inc                 | 3129.0        | 0.00464822  | AUFNIRA
CVC4-inc                 | 4.00793257    | 0.00461174  | QF_ANIA
CVC4-inc                 | 3.28915663    | 0.00339779  | ABVFP
CVC4-inc                 | 2.58073329    | 0.12378406  | BV
CVC4-inc                 | 1.3556116     | 0.0031942   | BVFP
Yices 2.6.2 Incremental  | 1.33282675    | 4.23575783  | QF_AUFBV
CVC4-inc                 | 1.31463961    | 18.34734075 | UFLRA
Yices 2.6.2 Incremental  | 1.29305913    | 1.0         | QF_LRA
MathSAT-na-ext           | 1.01315789    | 1.06477707  | QF_AUFBVNIA
Yices 2.6.2 Incremental  | 1.00566493    | 1.71829501  | QF_LIA
Yices 2.6.2 Incremental  | 1.00286123    | 3.29848695  | QF_ABV
Yices 2.6.2 Incremental  | 1.00071925    | 2.54794281  | QF_UFLRA
SMTInterpol              | 1.00060635    | 0.86818961  | QF_UFLIA
SMTInterpol              | 1.00031516    | 1.55191614  | LIA
Yices 2.6.2 Incremental  | 1.00024257    | 0.85044099  | QF_BV
CVC4-inc                 | 1.00007406    | 0.67184884  | ALIA
SMTInterpol              | 1.00005091    | 3.15213188  | QF_ALIA
Yices 2.6.2 Incremental  | 1.0           | 5.64947027  | QF_AUFLIA
Yices 2.6.2 Incremental  | 1.0           | 5.34883945  | QF_UF
MathSAT-default          | 1.0           | 1.03030252  | QF_UFNIA
Yices 2.6.2 Incremental  | 1.0           | 1.02781673  | QF_UFBV
MathSAT-default          | 1.0           | 1.02251275  | QF_NIA

n = Non-competing.
e = Experimental.
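As a rough illustration of how a biggest-lead quotient for one division might be computed, here is a minimal sketch. It assumes (this formula is our reading of the competition's scoring rules, not quoted from this page) that the lead is the ratio (1 + best score) / (1 + second-best score), with the +1 terms guarding against a zero second-best score:

```python
def biggest_lead(correct_scores):
    """Biggest-lead quotient for one division.

    Assumed formula (not stated on this page):
    (1 + best correct score) / (1 + second-best correct score).
    """
    if len(correct_scores) < 2:
        raise ValueError("need at least two competing solvers")
    # Take the two highest scores in the division.
    best, second = sorted(correct_scores, reverse=True)[:2]
    return (1.0 + best) / (1.0 + second)

# Example: best solver scores 9.0, runner-up 4.0
print(biggest_lead([9.0, 4.0, 1.0]))  # → 2.0
```

Under this reading, a quotient near 1.0 (as in most divisions above) means the runner-up was nearly as strong as the winner, while a large quotient signals a dominant lead.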