SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2020

QF_LIA (Single Query Track)

Competition results for the QF_LIA division in the Single Query Track.

Page generated on 2020-07-04 11:46:59 +0000

Benchmarks: 2508
Time Limit: 1200 seconds
Memory Limit: 60 GB
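
Each solver/benchmark pair was run by the competition infrastructure under these limits. The sketch below is only an illustration of how one might approximate the same caps when re-running a benchmark locally; it assumes a hypothetical solver binary `./solver` and benchmark file, treats the 1200-second limit as wall-clock time, and is not the competition's actual harness.

```python
import resource
import subprocess

TIME_LIMIT_S = 1200                 # per-benchmark time limit in this division
MEMORY_LIMIT_BYTES = 60 * 10**9     # 60 GB memory limit

def limit_memory():
    # Cap the child's address space; a rough stand-in for the
    # competition's per-job memory limit (POSIX only).
    resource.setrlimit(resource.RLIMIT_AS,
                       (MEMORY_LIMIT_BYTES, MEMORY_LIMIT_BYTES))

def run_solver(solver_cmd, benchmark_path):
    """Run one solver on one SMT-LIB benchmark under the division limits."""
    try:
        proc = subprocess.run(
            solver_cmd + [benchmark_path],
            capture_output=True,
            text=True,
            timeout=TIME_LIMIT_S,
            preexec_fn=limit_memory,
        )
        return proc.stdout.strip()   # expected: "sat", "unsat", or "unknown"
    except subprocess.TimeoutExpired:
        return "timeout"

if __name__ == "__main__":
    # Hypothetical solver binary and benchmark file, for illustration only.
    print(run_solver(["./solver"], "benchmark.smt2"))
```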

Winners

Sequential Performance: CVC4
Parallel Performance: CVC4
SAT Performance (parallel): CVC4
UNSAT Performance (parallel): SMTInterpol
24s Performance (parallel): Yices2

Sequential Performance

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Abstained | Timeout | Memout
2019-Par4 (n) | 0 | 2447 | 93732.847 | 80352.617 | 2447 | 1376 | 1071 | 61 | 0 | 61 | 0
MathSAT5 (n) | 0 | 2329 | 443942.69 | 443796.202 | 2329 | 1309 | 1020 | 179 | 0 | 179 | 0
CVC4 | 0 | 2290 | 318190.742 | 317946.291 | 2290 | 1256 | 1034 | 218 | 0 | 217 | 0
z3 (n) | 0 | 2233 | 519173.25 | 519200.236 | 2233 | 1311 | 922 | 275 | 0 | 275 | 0
Yices2 | 0 | 2198 | 410092.801 | 410107.013 | 2198 | 1181 | 1017 | 310 | 0 | 310 | 0
Yices2-fixed (n) | 0 | 2197 | 410250.569 | 410171.974 | 2197 | 1180 | 1017 | 311 | 0 | 311 | 0
SMTInterpol | 0 | 2170 | 555671.409 | 527233.967 | 2170 | 1097 | 1073 | 338 | 0 | 338 | 0
SMTInterpol-fixed (n) | 0 | 2168 | 555846.38 | 527576.243 | 2168 | 1095 | 1073 | 340 | 0 | 340 | 0
veriT | 0 | 819 | 1433613.127 | 1433714.083 | 819 | 574 | 245 | 1689 | 0 | 1152 | 1

Parallel Performance

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Abstained | Timeout | Memout
2019-Par4 (n) | 0 | 2459 | 103682.297 | 74071.798 | 2459 | 1384 | 1075 | 49 | 0 | 49 | 0
MathSAT5 (n) | 0 | 2329 | 443971.89 | 443789.002 | 2329 | 1309 | 1020 | 179 | 0 | 179 | 0
CVC4 | 0 | 2290 | 318208.962 | 317939.421 | 2290 | 1256 | 1034 | 218 | 0 | 217 | 0
z3 (n) | 0 | 2233 | 519201.44 | 519189.756 | 2233 | 1311 | 922 | 275 | 0 | 275 | 0
Yices2 | 0 | 2198 | 410115.181 | 410099.193 | 2198 | 1181 | 1017 | 310 | 0 | 310 | 0
Yices2-fixed (n) | 0 | 2197 | 410271.249 | 410163.744 | 2197 | 1180 | 1017 | 311 | 0 | 311 | 0
SMTInterpol | 0 | 2170 | 555671.409 | 527233.967 | 2170 | 1097 | 1073 | 338 | 0 | 338 | 0
SMTInterpol-fixed (n) | 0 | 2170 | 555863.97 | 527552.913 | 2170 | 1097 | 1073 | 338 | 0 | 338 | 0
veriT | 0 | 819 | 1433703.647 | 1433672.183 | 819 | 574 | 245 | 1689 | 0 | 1152 | 1

SAT Performance

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout
2019-Par4 (n) | 0 | 1384 | 51796.307 | 31039.904 | 1384 | 1384 | 0 | 1124 | 49 | 0
z3 (n) | 0 | 1311 | 215944.195 | 215929.51 | 1311 | 1311 | 0 | 1197 | 275 | 0
MathSAT5 (n) | 0 | 1309 | 200590.763 | 200464.602 | 1309 | 1309 | 0 | 1199 | 179 | 0
CVC4 | 0 | 1256 | 206924.201 | 206786.28 | 1256 | 1256 | 0 | 1252 | 217 | 0
Yices2 | 0 | 1181 | 295917.05 | 295908.892 | 1181 | 1181 | 0 | 1327 | 310 | 0
Yices2-fixed (n) | 0 | 1180 | 296110.789 | 296014.219 | 1180 | 1180 | 0 | 1328 | 311 | 0
SMTInterpol | 0 | 1097 | 454690.446 | 440009.694 | 1097 | 1097 | 0 | 1411 | 338 | 0
SMTInterpol-fixed (n) | 0 | 1097 | 454844.083 | 440201.975 | 1097 | 1097 | 0 | 1411 | 338 | 0
veriT | 0 | 574 | 683786.095 | 683777.036 | 574 | 574 | 0 | 1934 | 1152 | 1

UNSAT Performance

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout
2019-Par4 (n) | 0 | 1075 | 36285.99 | 27431.894 | 1075 | 0 | 1075 | 1433 | 49 | 0
SMTInterpol | 0 | 1073 | 85380.963 | 71624.273 | 1073 | 0 | 1073 | 1435 | 338 | 0
SMTInterpol-fixed (n) | 0 | 1073 | 85419.887 | 71750.938 | 1073 | 0 | 1073 | 1435 | 338 | 0
CVC4 | 0 | 1034 | 95684.761 | 95553.142 | 1034 | 0 | 1034 | 1474 | 217 | 0
MathSAT5 (n) | 0 | 1020 | 227781.127 | 227724.4 | 1020 | 0 | 1020 | 1488 | 179 | 0
Yices2-fixed (n) | 0 | 1017 | 98560.46 | 98549.525 | 1017 | 0 | 1017 | 1491 | 311 | 0
Yices2 | 0 | 1017 | 98598.131 | 98590.301 | 1017 | 0 | 1017 | 1491 | 310 | 0
z3 (n) | 0 | 922 | 287657.245 | 287660.247 | 922 | 0 | 922 | 1586 | 275 | 0
veriT | 0 | 245 | 734317.552 | 734295.147 | 245 | 0 | 245 | 2263 | 1152 | 1

24 seconds Performance

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Abstained | Timeout | Memout
2019-Par4 (n) | 0 | 2405 | 7636.104 | 4497.818 | 2405 | 1345 | 1060 | 103 | 0 | 103 | 0
Yices2 | 0 | 2057 | 13494.279 | 13474.497 | 2057 | 1057 | 1000 | 451 | 0 | 451 | 0
Yices2-fixed (n) | 0 | 2057 | 13505.262 | 13484.435 | 2057 | 1057 | 1000 | 451 | 0 | 451 | 0
CVC4 | 0 | 1797 | 26411.213 | 26223.854 | 1797 | 994 | 803 | 711 | 0 | 711 | 0
z3 (n) | 0 | 1524 | 28434.148 | 28386.278 | 1524 | 864 | 660 | 984 | 0 | 984 | 0
MathSAT5 (n) | 0 | 1408 | 29959.777 | 29920.814 | 1408 | 850 | 558 | 1100 | 0 | 1100 | 0
SMTInterpol | 0 | 1216 | 45067.936 | 37042.005 | 1216 | 615 | 601 | 1292 | 0 | 1292 | 0
SMTInterpol-fixed (n) | 0 | 1213 | 44972.826 | 37049.903 | 1213 | 612 | 601 | 1295 | 0 | 1295 | 0
veriT | 0 | 703 | 33101.615 | 33066.388 | 703 | 500 | 203 | 1805 | 0 | 1304 | 1

(n) Non-competing.
Abstained: the number of benchmarks in the logics of this division that the solver chose to abstain from. For the SAT/UNSAT scores, this column also includes benchmarks not known to be SAT/UNSAT, respectively.
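
The count columns above satisfy two simple identities: Solved = Solved SAT + Solved UNSAT, and Solved + Unsolved + Abstained = 2508, the number of benchmarks in the division. The sketch below re-checks these identities on a few rows copied from the Sequential Performance table and reproduces the table's ordering; the sort key used here (fewest errors, then most correct answers, then lowest CPU time score) is an assumption, since the official ranking and tie-breaking are defined in the competition rules.

```python
# A few rows copied from the Sequential Performance table above:
# (solver, error, correct, cpu_time, wall_time,
#  solved, sat, unsat, unsolved, abstained, timeout, memout)
ROWS = [
    ("CVC4",        0, 2290,  318190.742,  317946.291, 2290, 1256, 1034,  218, 0,  217, 0),
    ("Yices2",      0, 2198,  410092.801,  410107.013, 2198, 1181, 1017,  310, 0,  310, 0),
    ("SMTInterpol", 0, 2170,  555671.409,  527233.967, 2170, 1097, 1073,  338, 0,  338, 0),
    ("veriT",       0,  819, 1433613.127, 1433714.083,  819,  574,  245, 1689, 0, 1152, 1),
]

TOTAL_BENCHMARKS = 2508

def check(row):
    (name, err, correct, cpu, wall,
     solved, sat, unsat, unsolved, abstained, timeout, memout) = row
    assert solved == sat + unsat, name
    assert solved + unsolved + abstained == TOTAL_BENCHMARKS, name
    # Every timeout/memout is an unsolved benchmark; the converse need not hold,
    # since a solver may also give up within the limits (e.g. answer "unknown"),
    # as veriT does here.
    assert timeout + memout <= unsolved, name

for row in ROWS:
    check(row)

# Order as in the table: fewer errors first, then more correct answers,
# then lower CPU time score (assumed tie-breaking; see the competition rules).
ranked = sorted(ROWS, key=lambda r: (r[1], -r[2], r[3]))
print([r[0] for r in ranked])   # ['CVC4', 'Yices2', 'SMTInterpol', 'veriT']
```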