SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2019

QF_ALIA (Single Query Track)

Competition results for the QF_ALIA division in the Single Query Track.

Page generated on 2019-07-23 17:56:09 +0000

Benchmarks: 139
Time Limit: 2400 seconds
Memory Limit: 60 GB

Winners

Sequential Performance: Yices 2.6.2
Parallel Performance: Yices 2.6.2
SAT Performance (parallel): Yices 2.6.2
UNSAT Performance (parallel): Yices 2.6.2
24s Performance (parallel): Yices 2.6.2

Sequential Performance

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Abstained | Timeout | Memout
Yices 2.6.2 | 0 | 139 | 57.784 | 57.948 | 139 | 59 | 80 | 0 | 0 | 0 | 0
2018-Yices (n) | 0 | 139 | 61.481 | 61.647 | 139 | 59 | 80 | 0 | 0 | 0 | 0
SMTInterpol | 0 | 139 | 849.261 | 413.676 | 139 | 59 | 80 | 0 | 0 | 0 | 0
CVC4 | 0 | 138 | 5920.914 | 5922.027 | 138 | 59 | 79 | 1 | 0 | 1 | 0
Z3 (n) | 0 | 134 | 13291.91 | 13290.119 | 134 | 54 | 80 | 5 | 0 | 5 | 0
Alt-Ergo | 0 | 70 | 142021.048 | 133943.646 | 70 | 0 | 70 | 69 | 0 | 54 | 0
veriT | 0 | 16 | 7.645 | 7.606 | 16 | 0 | 16 | 123 | 0 | 0 | 0

Parallel Performance

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Abstained | Timeout | Memout
Yices 2.6.2 | 0 | 139 | 57.784 | 57.948 | 139 | 59 | 80 | 0 | 0 | 0 | 0
2018-Yices (n) | 0 | 139 | 61.481 | 61.647 | 139 | 59 | 80 | 0 | 0 | 0 | 0
SMTInterpol | 0 | 139 | 849.261 | 413.676 | 139 | 59 | 80 | 0 | 0 | 0 | 0
CVC4 | 0 | 138 | 5921.154 | 5922.007 | 138 | 59 | 79 | 1 | 0 | 1 | 0
Z3 (n) | 0 | 134 | 13293.57 | 13289.869 | 134 | 54 | 80 | 5 | 0 | 5 | 0
Alt-Ergo | 0 | 71 | 154452.518 | 129673.616 | 71 | 0 | 71 | 68 | 0 | 49 | 0
veriT | 0 | 16 | 7.645 | 7.606 | 16 | 0 | 16 | 123 | 0 | 0 | 0

SAT Performance

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout
Yices 2.6.2 | 0 | 59 | 51.416 | 51.42 | 59 | 59 | 0 | 80 | 0 | 0
2018-Yices (n) | 0 | 59 | 54.831 | 54.843 | 59 | 59 | 0 | 80 | 0 | 0
SMTInterpol | 0 | 59 | 450.646 | 179.278 | 59 | 59 | 0 | 80 | 0 | 0
CVC4 | 0 | 59 | 1940.408 | 1941.108 | 59 | 59 | 0 | 80 | 1 | 0
Z3 (n) | 0 | 54 | 13203.67 | 13199.968 | 54 | 54 | 0 | 85 | 5 | 0
veriT | 0 | 0 | 3.753 | 3.737 | 0 | 0 | 0 | 139 | 0 | 0
Alt-Ergo | 0 | 0 | 121389.831 | 105180.367 | 0 | 0 | 0 | 139 | 49 | 0

UNSAT Performance

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout
Yices 2.6.2 | 0 | 80 | 6.367 | 6.528 | 80 | 0 | 80 | 59 | 0 | 0
2018-Yices (n) | 0 | 80 | 6.65 | 6.805 | 80 | 0 | 80 | 59 | 0 | 0
Z3 (n) | 0 | 80 | 89.9 | 89.901 | 80 | 0 | 80 | 59 | 5 | 0
SMTInterpol | 0 | 80 | 398.615 | 234.399 | 80 | 0 | 80 | 59 | 0 | 0
CVC4 | 0 | 79 | 3980.746 | 3980.899 | 79 | 0 | 79 | 60 | 1 | 0
Alt-Ergo | 0 | 71 | 33062.687 | 24493.249 | 71 | 0 | 71 | 68 | 49 | 0
veriT | 0 | 16 | 3.892 | 3.87 | 16 | 0 | 16 | 123 | 0 | 0

24 seconds Performance

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Abstained | Timeout | Memout
Yices 2.6.2 | 0 | 139 | 57.784 | 57.948 | 139 | 59 | 80 | 0 | 0 | 0 | 0
2018-Yices (n) | 0 | 139 | 61.481 | 61.647 | 139 | 59 | 80 | 0 | 0 | 0 | 0
SMTInterpol | 0 | 136 | 767.209 | 367.851 | 136 | 59 | 77 | 3 | 0 | 3 | 0
Z3 (n) | 0 | 118 | 666.767 | 663.007 | 118 | 39 | 79 | 21 | 0 | 21 | 0
CVC4 | 0 | 116 | 634.852 | 634.819 | 116 | 45 | 71 | 23 | 0 | 23 | 0
Alt-Ergo | 0 | 66 | 2784.92 | 1960.728 | 66 | 0 | 66 | 73 | 0 | 69 | 0
veriT | 0 | 16 | 7.645 | 7.606 | 16 | 0 | 16 | 123 | 0 | 0 | 0

(n) Non-competing.
Abstained: total number of benchmarks in this division's logics that the solver chose to abstain from. For the SAT/UNSAT scores, this count also includes benchmarks not known to be SAT/UNSAT.