SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2020

QF_UFLRA (Single Query Track)

Competition results for the QF_UFLRA division in the Single Query Track.

Page generated on 2020-07-04 11:46:59 +0000

Benchmarks: 432
Time Limit: 1200 seconds
Memory Limit: 60 GB

Winners

Sequential Performance: Yices2
Parallel Performance: Yices2
SAT Performance (parallel): Yices2
UNSAT Performance (parallel): Yices2
24s Performance (parallel): Yices2

Sequential Performance

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Abstained | Timeout | Memout
2019-SMTInterpol (n) | 0 | 431 | 3979.857 | 2865.784 | 431 | 250 | 181 | 1 | 0 | 1 | 0
Yices2 | 0 | 430 | 2435.753 | 2436.894 | 430 | 249 | 181 | 2 | 0 | 2 | 0
Yices2-fixed (n) | 0 | 430 | 2435.76 | 2437.429 | 430 | 249 | 181 | 2 | 0 | 2 | 0
veriT | 0 | 430 | 2475.953 | 2476.367 | 430 | 249 | 181 | 2 | 0 | 2 | 0
MathSAT5 (n) | 0 | 430 | 2511.952 | 2512.67 | 430 | 249 | 181 | 2 | 0 | 2 | 0
CVC4 | 0 | 430 | 2994.438 | 2995.111 | 430 | 249 | 181 | 2 | 0 | 2 | 0
SMTInterpol | 0 | 430 | 4241.553 | 3214.897 | 430 | 249 | 181 | 2 | 0 | 2 | 0
SMTInterpol-fixed (n) | 0 | 430 | 4266.487 | 3218.192 | 430 | 249 | 181 | 2 | 0 | 2 | 0
z3 (n) | 0 | 412 | 24281.095 | 24283.533 | 412 | 231 | 181 | 20 | 0 | 20 | 0
Alt-Ergo | 0 | 170 | 138940.3 | 74346.897 | 170 | 0 | 170 | 262 | 0 | 32 | 8

Parallel Performance

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Abstained | Timeout | Memout
2019-SMTInterpol (n) | 0 | 431 | 3979.857 | 2865.784 | 431 | 250 | 181 | 1 | 0 | 1 | 0
Yices2 | 0 | 430 | 2435.923 | 2436.764 | 430 | 249 | 181 | 2 | 0 | 2 | 0
Yices2-fixed (n) | 0 | 430 | 2435.93 | 2437.329 | 430 | 249 | 181 | 2 | 0 | 2 | 0
veriT | 0 | 430 | 2476.263 | 2476.297 | 430 | 249 | 181 | 2 | 0 | 2 | 0
MathSAT5 (n) | 0 | 430 | 2512.562 | 2512.62 | 430 | 249 | 181 | 2 | 0 | 2 | 0
CVC4 | 0 | 430 | 2995.068 | 2994.991 | 430 | 249 | 181 | 2 | 0 | 2 | 0
SMTInterpol | 0 | 430 | 4241.553 | 3214.897 | 430 | 249 | 181 | 2 | 0 | 2 | 0
SMTInterpol-fixed (n) | 0 | 430 | 4266.487 | 3218.192 | 430 | 249 | 181 | 2 | 0 | 2 | 0
z3 (n) | 0 | 412 | 24282.585 | 24282.653 | 412 | 231 | 181 | 20 | 0 | 20 | 0
Alt-Ergo | 0 | 172 | 150111.79 | 65655.268 | 172 | 0 | 172 | 260 | 0 | 19 | 8

SAT Performance

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout
2019-SMTInterpol (n) | 0 | 250 | 2064.865 | 1399.948 | 250 | 250 | 0 | 182 | 1 | 0
Yices2 | 0 | 249 | 1229.894 | 1230.338 | 249 | 249 | 0 | 183 | 2 | 0
Yices2-fixed (n) | 0 | 249 | 1229.898 | 1230.671 | 249 | 249 | 0 | 183 | 2 | 0
veriT | 0 | 249 | 1268.023 | 1268.073 | 249 | 249 | 0 | 183 | 2 | 0
MathSAT5 (n) | 0 | 249 | 1301.549 | 1301.587 | 249 | 249 | 0 | 183 | 2 | 0
CVC4 | 0 | 249 | 1759.028 | 1759.005 | 249 | 249 | 0 | 183 | 2 | 0
SMTInterpol | 0 | 249 | 2363.334 | 1764.03 | 249 | 249 | 0 | 183 | 2 | 0
SMTInterpol-fixed (n) | 0 | 249 | 2370.627 | 1764.507 | 249 | 249 | 0 | 183 | 2 | 0
z3 (n) | 0 | 231 | 23043.083 | 23043.132 | 231 | 231 | 0 | 201 | 20 | 0
Alt-Ergo | 0 | 0 | 103020.926 | 44808.936 | 0 | 0 | 0 | 432 | 19 | 8

UNSAT Performance

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout
Yices2 | 0 | 181 | 6.029 | 6.426 | 181 | 0 | 181 | 251 | 2 | 0
Yices2-fixed (n) | 0 | 181 | 6.032 | 6.659 | 181 | 0 | 181 | 251 | 2 | 0
veriT | 0 | 181 | 8.24 | 8.224 | 181 | 0 | 181 | 251 | 2 | 0
MathSAT5 (n) | 0 | 181 | 11.013 | 11.033 | 181 | 0 | 181 | 251 | 2 | 0
CVC4 | 0 | 181 | 36.04 | 35.986 | 181 | 0 | 181 | 251 | 2 | 0
z3 (n) | 0 | 181 | 39.501 | 39.521 | 181 | 0 | 181 | 251 | 20 | 0
SMTInterpol | 0 | 181 | 678.22 | 250.867 | 181 | 0 | 181 | 251 | 2 | 0
SMTInterpol-fixed (n) | 0 | 181 | 695.86 | 253.685 | 181 | 0 | 181 | 251 | 2 | 0
2019-SMTInterpol (n) | 0 | 181 | 714.992 | 265.835 | 181 | 0 | 181 | 251 | 1 | 0
Alt-Ergo | 0 | 172 | 45890.864 | 19646.332 | 172 | 0 | 172 | 260 | 19 | 8

24 seconds Performance

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Abstained | Timeout | Memout
Yices2 | 0 | 430 | 83.923 | 84.764 | 430 | 249 | 181 | 2 | 0 | 2 | 0
Yices2-fixed (n) | 0 | 430 | 83.93 | 85.329 | 430 | 249 | 181 | 2 | 0 | 2 | 0
veriT | 0 | 430 | 124.263 | 124.297 | 430 | 249 | 181 | 2 | 0 | 2 | 0
MathSAT5 (n) | 0 | 429 | 157.561 | 157.618 | 429 | 248 | 181 | 3 | 0 | 3 | 0
SMTInterpol-fixed (n) | 0 | 427 | 1523.489 | 645.988 | 427 | 246 | 181 | 5 | 0 | 5 | 0
SMTInterpol | 0 | 427 | 1506.36 | 648.54 | 427 | 246 | 181 | 5 | 0 | 5 | 0
2019-SMTInterpol (n) | 0 | 427 | 1532.293 | 656.306 | 427 | 246 | 181 | 5 | 0 | 5 | 0
CVC4 | 0 | 426 | 243.183 | 243.072 | 426 | 245 | 181 | 6 | 0 | 6 | 0
z3 (n) | 0 | 409 | 656.627 | 656.671 | 409 | 228 | 181 | 23 | 0 | 23 | 0
Alt-Ergo | 0 | 66 | 13248.031 | 9138.717 | 66 | 0 | 66 | 366 | 0 | 307 | 8

(n) Non-competing.
Abstained: total number of benchmarks in this division's logics that the solver chose to abstain from. In the SAT and UNSAT performance tables, the Unsolved column also counts benchmarks not known to be SAT or UNSAT, respectively.
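As a rough illustration of how the count columns above fit together, the sketch below re-derives a few of them from the Yices2 rows on this page: the 432-benchmark division total, and the rule that the SAT/UNSAT tables count every benchmark not solved with the relevant status as unsolved. This is our own illustrative check, not part of any SMT-COMP tooling.

```python
# Minimal sketch (not SMT-COMP tooling): sanity-check how the count columns
# on this page relate, using the Yices2 rows from the tables above.

TOTAL_BENCHMARKS = 432  # "Benchmarks: 432" for QF_UFLRA in 2020

# Yices2, Sequential Performance row.
solved, solved_sat, solved_unsat, unsolved = 430, 249, 181, 2

assert solved == solved_sat + solved_unsat      # 430 = 249 + 181
assert solved + unsolved == TOTAL_BENCHMARKS    # 430 + 2 = 432

# SAT Performance row: "Unsolved" counts every benchmark not solved as SAT,
# i.e. the solved-UNSAT instances plus the genuinely unsolved ones.
sat_unsolved = TOTAL_BENCHMARKS - solved_sat
assert sat_unsolved == solved_unsat + unsolved == 183

# UNSAT Performance row: the symmetric count for benchmarks not solved as UNSAT.
unsat_unsolved = TOTAL_BENCHMARKS - solved_unsat
assert unsat_unsolved == solved_sat + unsolved == 251

print(sat_unsolved, unsat_unsolved)  # 183 251, matching the tables above
```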