SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2020

QF_ALIA (Single Query Track)

Competition results for the QF_ALIA division in the Single Query Track.

Page generated on 2020-07-04 11:46:58 +0000

Benchmarks: 116
Time Limit: 1200 seconds
Memory Limit: 60 GB

Winners

Sequential Performance: Yices2
Parallel Performance: Yices2
SAT Performance (parallel): Yices2
UNSAT Performance (parallel): Yices2
24s Performance (parallel): Yices2

Sequential Performance

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout
2019-Yices 2.6.2 (n) | 0 | 116 | 56.223 | 55.591 | 116 | 54 | 62 | 0 | 0 | 0
Yices2-fixed (n) | 0 | 116 | 68.043 | 68.548 | 116 | 54 | 62 | 0 | 0 | 0
Yices2 | 0 | 116 | 69.148 | 68.544 | 116 | 54 | 62 | 0 | 0 | 0
MathSAT5 (n) | 0 | 116 | 102.499 | 102.516 | 116 | 54 | 62 | 0 | 0 | 0
z3 (n) | 0 | 116 | 621.208 | 621.279 | 116 | 54 | 62 | 0 | 0 | 0
SMTInterpol | 0 | 116 | 1036.445 | 573.192 | 116 | 54 | 62 | 0 | 0 | 0
SMTInterpol-fixed (n) | 0 | 116 | 1065.72 | 575.918 | 116 | 54 | 62 | 0 | 0 | 0
CVC4 | 0 | 115 | 5085.81 | 5083.448 | 115 | 54 | 61 | 1 | 1 | 0
Alt-Ergo | 0 | 51 | 70269.95 | 68598.862 | 51 | 0 | 51 | 65 | 56 | 0
veriT | 0 | 6 | 4.564 | 4.605 | 6 | 0 | 6 | 110 | 0 | 0

Parallel Performance

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout
2019-Yices 2.6.2 (n) | 0 | 116 | 56.223 | 55.591 | 116 | 54 | 62 | 0 | 0 | 0
Yices2 | 0 | 116 | 69.148 | 68.544 | 116 | 54 | 62 | 0 | 0 | 0
Yices2-fixed (n) | 0 | 116 | 68.043 | 68.548 | 116 | 54 | 62 | 0 | 0 | 0
MathSAT5 (n) | 0 | 116 | 102.499 | 102.516 | 116 | 54 | 62 | 0 | 0 | 0
SMTInterpol | 0 | 116 | 1036.445 | 573.192 | 116 | 54 | 62 | 0 | 0 | 0
SMTInterpol-fixed (n) | 0 | 116 | 1065.72 | 575.918 | 116 | 54 | 62 | 0 | 0 | 0
z3 (n) | 0 | 116 | 621.208 | 621.279 | 116 | 54 | 62 | 0 | 0 | 0
CVC4 | 0 | 115 | 5085.84 | 5083.408 | 115 | 54 | 61 | 1 | 1 | 0
Alt-Ergo | 0 | 52 | 78373.52 | 66856.456 | 52 | 0 | 52 | 64 | 51 | 0
veriT | 0 | 6 | 4.564 | 4.605 | 6 | 0 | 6 | 110 | 0 | 0

SAT Performance

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout
2019-Yices 2.6.2 (n) | 0 | 54 | 50.187 | 49.181 | 54 | 54 | 0 | 62 | 0 | 0
Yices2-fixed (n) | 0 | 54 | 61.338 | 61.374 | 54 | 54 | 0 | 62 | 0 | 0
Yices2 | 0 | 54 | 62.478 | 61.507 | 54 | 54 | 0 | 62 | 0 | 0
MathSAT5 (n) | 0 | 54 | 72.117 | 72.127 | 54 | 54 | 0 | 62 | 0 | 0
SMTInterpol | 0 | 54 | 501.103 | 209.542 | 54 | 54 | 0 | 62 | 0 | 0
SMTInterpol-fixed (n) | 0 | 54 | 526.189 | 210.948 | 54 | 54 | 0 | 62 | 0 | 0
z3 (n) | 0 | 54 | 587.124 | 587.186 | 54 | 54 | 0 | 62 | 0 | 0
CVC4 | 0 | 54 | 2134.205 | 2131.448 | 54 | 54 | 0 | 62 | 1 | 0
veriT | 0 | 0 | 2.503 | 2.505 | 0 | 0 | 0 | 116 | 0 | 0
Alt-Ergo | 0 | 0 | 62758.233 | 53942.881 | 0 | 0 | 0 | 116 | 51 | 0

UNSAT Performance

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout
2019-Yices 2.6.2 (n) | 0 | 62 | 6.036 | 6.41 | 62 | 0 | 62 | 54 | 0 | 0
Yices2 | 0 | 62 | 6.67 | 7.037 | 62 | 0 | 62 | 54 | 0 | 0
Yices2-fixed (n) | 0 | 62 | 6.705 | 7.174 | 62 | 0 | 62 | 54 | 0 | 0
MathSAT5 (n) | 0 | 62 | 30.381 | 30.388 | 62 | 0 | 62 | 54 | 0 | 0
z3 (n) | 0 | 62 | 34.083 | 34.092 | 62 | 0 | 62 | 54 | 0 | 0
SMTInterpol | 0 | 62 | 535.342 | 363.65 | 62 | 0 | 62 | 54 | 0 | 0
SMTInterpol-fixed (n) | 0 | 62 | 539.532 | 364.97 | 62 | 0 | 62 | 54 | 0 | 0
CVC4 | 0 | 61 | 2951.635 | 2951.96 | 61 | 0 | 61 | 55 | 1 | 0
Alt-Ergo | 0 | 52 | 15615.287 | 12913.575 | 52 | 0 | 52 | 64 | 51 | 0
veriT | 0 | 6 | 2.061 | 2.1 | 6 | 0 | 6 | 110 | 0 | 0

24 seconds Performance

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout
2019-Yices 2.6.2 (n) | 0 | 116 | 56.223 | 55.591 | 116 | 54 | 62 | 0 | 0 | 0
Yices2 | 0 | 116 | 69.148 | 68.544 | 116 | 54 | 62 | 0 | 0 | 0
Yices2-fixed (n) | 0 | 116 | 68.043 | 68.548 | 116 | 54 | 62 | 0 | 0 | 0
MathSAT5 (n) | 0 | 115 | 98.241 | 98.257 | 115 | 53 | 62 | 1 | 1 | 0
SMTInterpol | 0 | 110 | 769.866 | 395.522 | 110 | 53 | 57 | 6 | 6 | 0
SMTInterpol-fixed (n) | 0 | 110 | 796.708 | 397.289 | 110 | 53 | 57 | 6 | 6 | 0
z3 (n) | 0 | 107 | 425.375 | 425.413 | 107 | 45 | 62 | 9 | 9 | 0
CVC4 | 0 | 95 | 617.103 | 613.885 | 95 | 41 | 54 | 21 | 21 | 0
Alt-Ergo | 0 | 48 | 2152.706 | 1716.808 | 48 | 0 | 48 | 68 | 65 | 0
veriT | 0 | 6 | 4.564 | 4.605 | 6 | 0 | 6 | 110 | 0 | 0

(n) Non-competing.