SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.


SMT-COMP 2020


QF_UFLIA (Single Query Track)

Competition results for the QF_UFLIA division in the Single Query Track.

Page generated on 2020-07-04 11:46:59 +0000

Benchmarks: 300
Time Limit: 1200 seconds
Memory Limit: 60 GB

Winners

Sequential Performance: Yices2
Parallel Performance: Yices2
SAT Performance (parallel): Yices2
UNSAT Performance (parallel): Yices2
24s Performance (parallel): Yices2

Sequential Performance

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Abstained | Timeout | Memout
2018-Yices (n) | 0 | 300 | 52.376 | 53.545 | 300 | 231 | 69 | 0 | 0 | 0 | 0
Yices2-fixed (n) | 0 | 300 | 66.073 | 56.288 | 300 | 231 | 69 | 0 | 0 | 0 | 0
Yices2 | 0 | 300 | 73.354 | 55.636 | 300 | 231 | 69 | 0 | 0 | 0 | 0
z3 (n) | 0 | 300 | 147.535 | 147.592 | 300 | 231 | 69 | 0 | 0 | 0 | 0
CVC4 | 0 | 300 | 565.883 | 565.866 | 300 | 231 | 69 | 0 | 0 | 0 | 0
SMTInterpol | 0 | 300 | 2423.497 | 1764.278 | 300 | 231 | 69 | 0 | 0 | 0 | 0
SMTInterpol-fixed (n) | 0 | 300 | 2434.247 | 1811.616 | 300 | 231 | 69 | 0 | 0 | 0 | 0
MathSAT5 (n) | 0 | 298 | 4640.648 | 4641.692 | 298 | 230 | 68 | 2 | 0 | 2 | 0
veriT | 0 | 273 | 25859.468 | 25839.299 | 273 | 206 | 67 | 27 | 0 | 7 | 0
Alt-Ergo | 0 | 64 | 92381.571 | 87739.173 | 64 | 0 | 64 | 236 | 0 | 60 | 0
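The counts in these tables obey two invariants: Solved = Solved SAT + Solved UNSAT, and Solved + Unsolved + Abstained equals the 300 benchmarks in the division. A minimal sketch (plain Python, not an official SMT-COMP script) that checks a few rows from the table above:

```python
# Consistency check over a sample of rows from the Sequential Performance
# table: (solver, solved, solved_sat, solved_unsat, unsolved, abstained).
rows = [
    ("2018-Yices", 300, 231, 69,   0, 0),
    ("MathSAT5",   298, 230, 68,   2, 0),
    ("veriT",      273, 206, 67,  27, 0),
    ("Alt-Ergo",    64,   0, 64, 236, 0),
]
BENCHMARKS = 300  # division size, stated above

for solver, solved, sat, unsat, unsolved, abstained in rows:
    # every solved benchmark is either SAT or UNSAT
    assert solved == sat + unsat, solver
    # every benchmark is solved, unsolved, or abstained
    assert solved + unsolved + abstained == BENCHMARKS, solver

print("all rows consistent")
```

The same invariants hold for the parallel and 24-second tables; only the SAT/UNSAT performance tables differ, since there Unsolved also counts benchmarks of the other status.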

Parallel Performance

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Abstained | Timeout | Memout
2018-Yices (n) | 0 | 300 | 52.376 | 53.545 | 300 | 231 | 69 | 0 | 0 | 0 | 0
Yices2 | 0 | 300 | 73.354 | 55.636 | 300 | 231 | 69 | 0 | 0 | 0 | 0
Yices2-fixed (n) | 0 | 300 | 66.073 | 56.288 | 300 | 231 | 69 | 0 | 0 | 0 | 0
z3 (n) | 0 | 300 | 147.535 | 147.592 | 300 | 231 | 69 | 0 | 0 | 0 | 0
CVC4 | 0 | 300 | 565.883 | 565.866 | 300 | 231 | 69 | 0 | 0 | 0 | 0
SMTInterpol | 0 | 300 | 2423.497 | 1764.278 | 300 | 231 | 69 | 0 | 0 | 0 | 0
SMTInterpol-fixed (n) | 0 | 300 | 2434.247 | 1811.616 | 300 | 231 | 69 | 0 | 0 | 0 | 0
MathSAT5 (n) | 0 | 298 | 4640.978 | 4641.612 | 298 | 230 | 68 | 2 | 0 | 2 | 0
veriT | 0 | 273 | 25860.048 | 25839.029 | 273 | 206 | 67 | 27 | 0 | 7 | 0
Alt-Ergo | 0 | 65 | 102569.951 | 86021.328 | 65 | 0 | 65 | 235 | 0 | 57 | 0

SAT Performance

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout
Yices2 | 0 | 231 | 4.595 | 5.25 | 231 | 231 | 0 | 69 | 0 | 0
Yices2-fixed (n) | 0 | 231 | 4.567 | 5.84 | 231 | 231 | 0 | 69 | 0 | 0
2018-Yices (n) | 0 | 231 | 5.008 | 5.892 | 231 | 231 | 0 | 69 | 0 | 0
z3 (n) | 0 | 231 | 30.12 | 30.149 | 231 | 231 | 0 | 69 | 0 | 0
CVC4 | 0 | 231 | 135.741 | 135.705 | 231 | 231 | 0 | 69 | 0 | 0
SMTInterpol | 0 | 231 | 1776.429 | 1303.939 | 231 | 231 | 0 | 69 | 0 | 0
SMTInterpol-fixed (n) | 0 | 231 | 1786.932 | 1337.729 | 231 | 231 | 0 | 69 | 0 | 0
MathSAT5 (n) | 0 | 230 | 1694.944 | 1695.009 | 230 | 230 | 0 | 70 | 2 | 0
veriT | 0 | 206 | 21291.699 | 21292.956 | 206 | 206 | 0 | 94 | 7 | 0
Alt-Ergo | 0 | 0 | 95344.067 | 80602.214 | 0 | 0 | 0 | 300 | 57 | 0

UNSAT Performance

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout
2018-Yices (n) | 0 | 69 | 47.368 | 47.654 | 69 | 0 | 69 | 231 | 0 | 0
Yices2 | 0 | 69 | 68.758 | 50.386 | 69 | 0 | 69 | 231 | 0 | 0
Yices2-fixed (n) | 0 | 69 | 61.506 | 50.448 | 69 | 0 | 69 | 231 | 0 | 0
z3 (n) | 0 | 69 | 117.414 | 117.443 | 69 | 0 | 69 | 231 | 0 | 0
CVC4 | 0 | 69 | 430.142 | 430.161 | 69 | 0 | 69 | 231 | 0 | 0
SMTInterpol | 0 | 69 | 647.068 | 460.339 | 69 | 0 | 69 | 231 | 0 | 0
SMTInterpol-fixed (n) | 0 | 69 | 647.315 | 473.888 | 69 | 0 | 69 | 231 | 0 | 0
MathSAT5 (n) | 0 | 68 | 2946.034 | 2946.602 | 68 | 0 | 68 | 232 | 2 | 0
veriT | 0 | 67 | 4568.349 | 4546.073 | 67 | 0 | 67 | 233 | 7 | 0
Alt-Ergo | 0 | 65 | 7225.883 | 5419.115 | 65 | 0 | 65 | 235 | 57 | 0

24 seconds Performance

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Abstained | Timeout | Memout
2018-Yices (n) | 0 | 300 | 52.376 | 53.545 | 300 | 231 | 69 | 0 | 0 | 0 | 0
Yices2 | 0 | 300 | 73.354 | 55.636 | 300 | 231 | 69 | 0 | 0 | 0 | 0
Yices2-fixed (n) | 0 | 300 | 66.073 | 56.288 | 300 | 231 | 69 | 0 | 0 | 0 | 0
z3 (n) | 0 | 299 | 94.222 | 94.269 | 299 | 231 | 68 | 1 | 0 | 1 | 0
CVC4 | 0 | 295 | 345.45 | 345.404 | 295 | 231 | 64 | 5 | 0 | 5 | 0
SMTInterpol-fixed (n) | 0 | 294 | 1032.573 | 506.967 | 294 | 228 | 66 | 6 | 0 | 6 | 0
SMTInterpol | 0 | 294 | 1086.366 | 527.095 | 294 | 228 | 66 | 6 | 0 | 6 | 0
MathSAT5 (n) | 0 | 289 | 402.741 | 402.785 | 289 | 224 | 65 | 11 | 0 | 11 | 0
veriT | 0 | 202 | 3079.26 | 3056.821 | 202 | 145 | 57 | 98 | 0 | 88 | 0
Alt-Ergo | 0 | 62 | 5280.269 | 3526.118 | 62 | 0 | 62 | 238 | 0 | 117 | 0

(n) Non-competing.
Abstained: total number of benchmarks in this division that the solver chose to abstain from. For the SAT and UNSAT scores, this count also includes benchmarks whose status is not known to be SAT or UNSAT.