SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.


SMT-COMP 2020


QF_UFLIA (Single Query Track)

Competition results for the QF_UFLIA division in the Single Query Track.

Page generated on 2020-07-04 11:46:59 +0000

Benchmarks: 300
Time Limit: 1200 seconds
Memory Limit: 60 GB

Winners

Sequential Performance:         Yices2
Parallel Performance:           Yices2
SAT Performance (parallel):     Yices2
UNSAT Performance (parallel):   Yices2
24s Performance (parallel):     Yices2

Sequential Performance

Solver              Error Score  Correct Score  CPU Time Score  Wall Time Score  Solved  Solved SAT  Solved UNSAT  Unsolved  Timeout  Memout
2018-Yicesⁿ         0            300            52.376          53.545           300     231         69            0         0        0
Yices2-fixedⁿ       0            300            66.073          56.288           300     231         69            0         0        0
Yices2              0            300            73.354          55.636           300     231         69            0         0        0
z3ⁿ                 0            300            147.535         147.592          300     231         69            0         0        0
CVC4                0            300            565.883         565.866          300     231         69            0         0        0
SMTInterpol         0            300            2423.497        1764.278         300     231         69            0         0        0
SMTInterpol-fixedⁿ  0            300            2434.247        1811.616         300     231         69            0         0        0
MathSAT5ⁿ           0            298            4640.648        4641.692         298     230         68            2         2        0
veriT               0            273            25859.468       25839.299        273     206         67            27        7        0
Alt-Ergo            0            64             92381.571       87739.173        64      0           64            236       60       0

Parallel Performance

Solver              Error Score  Correct Score  CPU Time Score  Wall Time Score  Solved  Solved SAT  Solved UNSAT  Unsolved  Timeout  Memout
2018-Yicesⁿ         0            300            52.376          53.545           300     231         69            0         0        0
Yices2              0            300            73.354          55.636           300     231         69            0         0        0
Yices2-fixedⁿ       0            300            66.073          56.288           300     231         69            0         0        0
z3ⁿ                 0            300            147.535         147.592          300     231         69            0         0        0
CVC4                0            300            565.883         565.866          300     231         69            0         0        0
SMTInterpol         0            300            2423.497        1764.278         300     231         69            0         0        0
SMTInterpol-fixedⁿ  0            300            2434.247        1811.616         300     231         69            0         0        0
MathSAT5ⁿ           0            298            4640.978        4641.612         298     230         68            2         2        0
veriT               0            273            25860.048       25839.029        273     206         67            27        7        0
Alt-Ergo            0            65             102569.951      86021.328        65      0           65            235       57       0

SAT Performance

Solver              Error Score  Correct Score  CPU Time Score  Wall Time Score  Solved  Solved SAT  Solved UNSAT  Unsolved  Timeout  Memout
Yices2              0            231            4.595           5.25             231     231         0             69        0        0
Yices2-fixedⁿ       0            231            4.567           5.84             231     231         0             69        0        0
2018-Yicesⁿ         0            231            5.008           5.892            231     231         0             69        0        0
z3ⁿ                 0            231            30.12           30.149           231     231         0             69        0        0
CVC4                0            231            135.741         135.705          231     231         0             69        0        0
SMTInterpol         0            231            1776.429        1303.939         231     231         0             69        0        0
SMTInterpol-fixedⁿ  0            231            1786.932        1337.729         231     231         0             69        0        0
MathSAT5ⁿ           0            230            1694.944        1695.009         230     230         0             70        2        0
veriT               0            206            21291.699       21292.956        206     206         0             94        7        0
Alt-Ergo            0            0              95344.067       80602.214        0       0           0             300       57       0

UNSAT Performance

Solver              Error Score  Correct Score  CPU Time Score  Wall Time Score  Solved  Solved SAT  Solved UNSAT  Unsolved  Timeout  Memout
2018-Yicesⁿ         0            69             47.368          47.654           69      0           69            231       0        0
Yices2              0            69             68.758          50.386           69      0           69            231       0        0
Yices2-fixedⁿ       0            69             61.506          50.448           69      0           69            231       0        0
z3ⁿ                 0            69             117.414         117.443          69      0           69            231       0        0
CVC4                0            69             430.142         430.161          69      0           69            231       0        0
SMTInterpol         0            69             647.068         460.339          69      0           69            231       0        0
SMTInterpol-fixedⁿ  0            69             647.315         473.888          69      0           69            231       0        0
MathSAT5ⁿ           0            68             2946.034        2946.602         68      0           68            232       2        0
veriT               0            67             4568.349        4546.073         67      0           67            233       7        0
Alt-Ergo            0            65             7225.883        5419.115         65      0           65            235       57       0

24 seconds Performance

Solver              Error Score  Correct Score  CPU Time Score  Wall Time Score  Solved  Solved SAT  Solved UNSAT  Unsolved  Timeout  Memout
2018-Yicesⁿ         0            300            52.376          53.545           300     231         69            0         0        0
Yices2              0            300            73.354          55.636           300     231         69            0         0        0
Yices2-fixedⁿ       0            300            66.073          56.288           300     231         69            0         0        0
z3ⁿ                 0            299            94.222          94.269           299     231         68            1         1        0
CVC4                0            295            345.45          345.404          295     231         64            5         5        0
SMTInterpol-fixedⁿ  0            294            1032.573        506.967          294     228         66            6         6        0
SMTInterpol         0            294            1086.366        527.095          294     228         66            6         6        0
MathSAT5ⁿ           0            289            402.741         402.785          289     224         65            11        11       0
veriT               0            202            3079.26         3056.821         202     145         57            98        88       0
Alt-Ergo            0            62             5280.269        3526.118         62      0           62            238       117      0

ⁿ Non-competing.