SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2020

QF_ALIA (Unsat Core Track)

Competition results for the QF_ALIA division in the Unsat Core Track.

Page generated on 2020-07-04 11:49:33 +0000

Benchmarks: 30
Time Limit: 1200 seconds
Memory Limit: 60 GB

Winners

Sequential Performance  Parallel Performance
CVC4-uc                 CVC4-uc

Note that z3 posts the highest correct score in this division but entered as a non-competing solver (marked (n) below), so CVC4-uc takes both performance titles.

Sequential Performance

Solver                 Error Score  Correct Score  CPU Time Score  Wall Time Score  Abstained  Timeout  Memout
z3 (n)                 0            720            1.239           1.241            0          0        0
CVC4-uc                0            677            1.17            1.182            0          0        0
SMTInterpol-fixed (n)  0            644            21.245          12.564           0          0        0
SMTInterpol            0            644            21.28           12.582           0          0        0
MathSAT5 (n)           0            583            0.789           0.823            0          0        0
Yices2-fixed (n)       0            564            0.155           0.38             0          0        0
Yices2                 0            564            0.162           0.526            0          0        0

Parallel Performance

Solver                 Error Score  Correct Score  CPU Time Score  Wall Time Score  Abstained  Timeout  Memout
z3 (n)                 0            720            1.239           1.241            0          0        0
CVC4-uc                0            677            1.17            1.182            0          0        0
SMTInterpol-fixed (n)  0            644            21.245          12.564           0          0        0
SMTInterpol            0            644            21.28           12.582           0          0        0
MathSAT5 (n)           0            583            0.789           0.823            0          0        0
Yices2-fixed (n)       0            564            0.155           0.38             0          0        0
Yices2                 0            564            0.162           0.526            0          0        0

(n) Non-competing.
Abstained: total number of benchmarks, across the logics in this division, that the solver chose to abstain from. For SAT/UNSAT scores, this column also counts benchmarks whose SAT/UNSAT status is not known.
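For readers who want to recompute the ranking from the rows above, here is a minimal sketch, not the official SMT-COMP tooling. It assumes the usual SMT-COMP ordering (error score ascending, then correct score descending, then wall-time score ascending) and excludes non-competing entries; the Row class and winner function are hypothetical names introduced here for illustration.

```python
# Minimal sketch (not the official SMT-COMP scoring code) of deriving a
# division winner from result rows like those in the tables above.
# Assumed ordering: error score ascending, correct score descending,
# wall-time score ascending; non-competing entries are excluded.

from dataclasses import dataclass

@dataclass
class Row:  # hypothetical helper type, not from SMT-COMP
    solver: str
    non_competing: bool  # footnote "(n)" in the tables above
    error: int
    correct: int
    cpu_time: float
    wall_time: float

# Values transcribed from the Sequential Performance table above.
ROWS = [
    Row("z3", True, 0, 720, 1.239, 1.241),
    Row("CVC4-uc", False, 0, 677, 1.17, 1.182),
    Row("SMTInterpol-fixed", True, 0, 644, 21.245, 12.564),
    Row("SMTInterpol", False, 0, 644, 21.28, 12.582),
    Row("MathSAT5", True, 0, 583, 0.789, 0.823),
    Row("Yices2-fixed", True, 0, 564, 0.155, 0.38),
    Row("Yices2", False, 0, 564, 0.162, 0.526),
]

def winner(rows):
    """Return the best competing solver under the assumed ordering."""
    competing = [r for r in rows if not r.non_competing]
    return min(competing, key=lambda r: (r.error, -r.correct, r.wall_time))

if __name__ == "__main__":
    print(winner(ROWS).solver)  # prints: CVC4-uc
```

Run as-is, the sketch prints CVC4-uc: z3 is filtered out as non-competing, and among the remaining entries CVC4-uc has the highest correct score (677).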