SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2020

QF_AUFLIA (Unsat Core Track)

Competition results for the QF_AUFLIA division in the Unsat Core Track.

Page generated on 2020-07-04 11:49:33 +0000

Benchmarks: 300
Time Limit: 1200 seconds
Memory Limit: 60 GB

Winners

Sequential Performance: CVC4-uc
Parallel Performance: CVC4-uc

Sequential Performance

Solver                 Error Score  Correct Score  CPU Time Score  Wall Time Score  Abstained  Timeout  Memout
CVC4-uc                0            21515          342.685         343.147          0          0        0
z3 (n)                 0            21159          18.784          18.833           0          0        0
Yices2-fixed (n)       0            16040          15.727          17.462           0          0        0
Yices2                 0            16040          15.843          18.102           0          0        0
SMTInterpol            0            1265           436.421         196.332          0          0        0
SMTInterpol-fixed (n)  0            1265           439.067         195.976          0          0        0
MathSAT5 (n)           82*          1282           83.487          83.791           0          0        0

Parallel Performance

Solver                 Error Score  Correct Score  CPU Time Score  Wall Time Score  Abstained  Timeout  Memout
CVC4-uc                0            21515          342.685         343.147          0          0        0
z3 (n)                 0            21159          18.784          18.833           0          0        0
Yices2-fixed (n)       0            16040          15.727          17.462           0          0        0
Yices2                 0            16040          15.843          18.102           0          0        0
SMTInterpol-fixed (n)  0            1265           439.067         195.976          0          0        0
SMTInterpol            0            1265           436.421         196.332          0          0        0
MathSAT5 (n)           82*          1282           83.487          83.791           0          0        0

(n) Non-competing.
Abstained: total number of benchmarks in this division's logics that the solver chose to abstain from. For SAT/UNSAT scores, this column also counts benchmarks not known to be SAT/UNSAT.
* The error score is caused by MathSAT emitting the wrong names in its unsat-core output (a syntactic problem); it does not indicate unsoundness.
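
For context, in the Unsat Core Track a solver is given an unsatisfiable benchmark with named assertions and must answer with a subset of those names that is still unsatisfiable. A minimal SMT-LIB sketch of the interaction (a hypothetical toy example, not one of the competition benchmarks; the constants x and a and the labels a1-a3 are made up for illustration):

    (set-option :produce-unsat-cores true)
    (set-logic QF_AUFLIA)
    (declare-const x Int)
    (declare-const a (Array Int Int))
    (assert (! (= (select a 0) x) :named a1))  ; a[0] = x
    (assert (! (> x 0) :named a2))             ; x > 0
    (assert (! (< (select a 0) 0) :named a3))  ; a[0] < 0, contradicts a1 and a2
    (check-sat)        ; solver answers: unsat
    (get-unsat-core)   ; solver answers, e.g.: (a1 a2 a3)

An error of the kind footnoted above means the solver's answer contains names that do not match any :named annotation in the input, so the answer is rejected on syntactic grounds even though the solver's reasoning may be sound.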