SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2020

QF_ALIA (Single Query Track)

Competition results for the QF_ALIA division in the Single Query Track.

Page generated on 2020-07-04 11:46:58 +0000

Benchmarks: 116
Time Limit: 1200 seconds
Memory Limit: 60 GB

Winners

| Sequential Performance | Parallel Performance | SAT Performance (parallel) | UNSAT Performance (parallel) | 24s Performance (parallel) |
| --- | --- | --- | --- | --- |
| Yices2 | Yices2 | Yices2 | Yices2 | Yices2 |

Sequential Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Abstained | Timeout | Memout |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 2019-Yices 2.6.2ⁿ | 0 | 116 | 56.223 | 55.591 | 116 | 54 | 62 | 0 | 0 | 0 | 0 |
| Yices2-fixedⁿ | 0 | 116 | 68.043 | 68.548 | 116 | 54 | 62 | 0 | 0 | 0 | 0 |
| Yices2 | 0 | 116 | 69.148 | 68.544 | 116 | 54 | 62 | 0 | 0 | 0 | 0 |
| MathSAT5ⁿ | 0 | 116 | 102.499 | 102.516 | 116 | 54 | 62 | 0 | 0 | 0 | 0 |
| z3ⁿ | 0 | 116 | 621.208 | 621.279 | 116 | 54 | 62 | 0 | 0 | 0 | 0 |
| SMTInterpol | 0 | 116 | 1036.445 | 573.192 | 116 | 54 | 62 | 0 | 0 | 0 | 0 |
| SMTInterpol-fixedⁿ | 0 | 116 | 1065.72 | 575.918 | 116 | 54 | 62 | 0 | 0 | 0 | 0 |
| CVC4 | 0 | 115 | 5085.81 | 5083.448 | 115 | 54 | 61 | 1 | 0 | 1 | 0 |
| Alt-Ergo | 0 | 51 | 70269.95 | 68598.862 | 51 | 0 | 51 | 65 | 0 | 56 | 0 |
| veriT | 0 | 6 | 4.564 | 4.605 | 6 | 0 | 6 | 110 | 0 | 0 | 0 |

Parallel Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Abstained | Timeout | Memout |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 2019-Yices 2.6.2ⁿ | 0 | 116 | 56.223 | 55.591 | 116 | 54 | 62 | 0 | 0 | 0 | 0 |
| Yices2 | 0 | 116 | 69.148 | 68.544 | 116 | 54 | 62 | 0 | 0 | 0 | 0 |
| Yices2-fixedⁿ | 0 | 116 | 68.043 | 68.548 | 116 | 54 | 62 | 0 | 0 | 0 | 0 |
| MathSAT5ⁿ | 0 | 116 | 102.499 | 102.516 | 116 | 54 | 62 | 0 | 0 | 0 | 0 |
| SMTInterpol | 0 | 116 | 1036.445 | 573.192 | 116 | 54 | 62 | 0 | 0 | 0 | 0 |
| SMTInterpol-fixedⁿ | 0 | 116 | 1065.72 | 575.918 | 116 | 54 | 62 | 0 | 0 | 0 | 0 |
| z3ⁿ | 0 | 116 | 621.208 | 621.279 | 116 | 54 | 62 | 0 | 0 | 0 | 0 |
| CVC4 | 0 | 115 | 5085.84 | 5083.408 | 115 | 54 | 61 | 1 | 0 | 1 | 0 |
| Alt-Ergo | 0 | 52 | 78373.52 | 66856.456 | 52 | 0 | 52 | 64 | 0 | 51 | 0 |
| veriT | 0 | 6 | 4.564 | 4.605 | 6 | 0 | 6 | 110 | 0 | 0 | 0 |

SAT Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 2019-Yices 2.6.2ⁿ | 0 | 54 | 50.187 | 49.181 | 54 | 54 | 0 | 62 | 0 | 0 |
| Yices2-fixedⁿ | 0 | 54 | 61.338 | 61.374 | 54 | 54 | 0 | 62 | 0 | 0 |
| Yices2 | 0 | 54 | 62.478 | 61.507 | 54 | 54 | 0 | 62 | 0 | 0 |
| MathSAT5ⁿ | 0 | 54 | 72.117 | 72.127 | 54 | 54 | 0 | 62 | 0 | 0 |
| SMTInterpol | 0 | 54 | 501.103 | 209.542 | 54 | 54 | 0 | 62 | 0 | 0 |
| SMTInterpol-fixedⁿ | 0 | 54 | 526.189 | 210.948 | 54 | 54 | 0 | 62 | 0 | 0 |
| z3ⁿ | 0 | 54 | 587.124 | 587.186 | 54 | 54 | 0 | 62 | 0 | 0 |
| CVC4 | 0 | 54 | 2134.205 | 2131.448 | 54 | 54 | 0 | 62 | 1 | 0 |
| veriT | 0 | 0 | 2.503 | 2.505 | 0 | 0 | 0 | 116 | 0 | 0 |
| Alt-Ergo | 0 | 0 | 62758.233 | 53942.881 | 0 | 0 | 0 | 116 | 51 | 0 |

UNSAT Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 2019-Yices 2.6.2ⁿ | 0 | 62 | 6.036 | 6.41 | 62 | 0 | 62 | 54 | 0 | 0 |
| Yices2 | 0 | 62 | 6.67 | 7.037 | 62 | 0 | 62 | 54 | 0 | 0 |
| Yices2-fixedⁿ | 0 | 62 | 6.705 | 7.174 | 62 | 0 | 62 | 54 | 0 | 0 |
| MathSAT5ⁿ | 0 | 62 | 30.381 | 30.388 | 62 | 0 | 62 | 54 | 0 | 0 |
| z3ⁿ | 0 | 62 | 34.083 | 34.092 | 62 | 0 | 62 | 54 | 0 | 0 |
| SMTInterpol | 0 | 62 | 535.342 | 363.65 | 62 | 0 | 62 | 54 | 0 | 0 |
| SMTInterpol-fixedⁿ | 0 | 62 | 539.532 | 364.97 | 62 | 0 | 62 | 54 | 0 | 0 |
| CVC4 | 0 | 61 | 2951.635 | 2951.96 | 61 | 0 | 61 | 55 | 1 | 0 |
| Alt-Ergo | 0 | 52 | 15615.287 | 12913.575 | 52 | 0 | 52 | 64 | 51 | 0 |
| veriT | 0 | 6 | 2.061 | 2.1 | 6 | 0 | 6 | 110 | 0 | 0 |

24 seconds Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Abstained | Timeout | Memout |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 2019-Yices 2.6.2ⁿ | 0 | 116 | 56.223 | 55.591 | 116 | 54 | 62 | 0 | 0 | 0 | 0 |
| Yices2 | 0 | 116 | 69.148 | 68.544 | 116 | 54 | 62 | 0 | 0 | 0 | 0 |
| Yices2-fixedⁿ | 0 | 116 | 68.043 | 68.548 | 116 | 54 | 62 | 0 | 0 | 0 | 0 |
| MathSAT5ⁿ | 0 | 115 | 98.241 | 98.257 | 115 | 53 | 62 | 1 | 0 | 1 | 0 |
| SMTInterpol | 0 | 110 | 769.866 | 395.522 | 110 | 53 | 57 | 6 | 0 | 6 | 0 |
| SMTInterpol-fixedⁿ | 0 | 110 | 796.708 | 397.289 | 110 | 53 | 57 | 6 | 0 | 6 | 0 |
| z3ⁿ | 0 | 107 | 425.375 | 425.413 | 107 | 45 | 62 | 9 | 0 | 9 | 0 |
| CVC4 | 0 | 95 | 617.103 | 613.885 | 95 | 41 | 54 | 21 | 0 | 21 | 0 |
| Alt-Ergo | 0 | 48 | 2152.706 | 1716.808 | 48 | 0 | 48 | 68 | 0 | 65 | 0 |
| veriT | 0 | 6 | 4.564 | 4.605 | 6 | 0 | 6 | 110 | 0 | 0 | 0 |

ⁿ Non-competing.
Abstained: Total number of benchmarks in this division's logics that the solver chose to abstain from. For the SAT/UNSAT scores, this count also includes benchmarks not known to be SAT/UNSAT.