SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.


SMT-COMP 2020


QF_UF (Unsat Core Track)

Competition results for the QF_UF division in the Unsat Core Track.

Page generated on 2020-07-04 11:49:33 +0000

Benchmarks: 1644
Time Limit: 1200 seconds
Memory Limit: 60 GB

Winners

Sequential Performance   Parallel Performance
SMTInterpol              SMTInterpol

Sequential Performance

Solver               Error Score   Correct Score   CPU Time Score   Wall Time Score   Abstained   Timeout   Memout
SMTInterpol-fixed n  0             215386          11536.215        6655.905          0           0         0
SMTInterpol          0             215366          11540.921        6636.559          0           0         0
z3 n                 0             215351          3394.389         3381.79           0           0         0
Yices2-fixed n       0             213972          2044.95          2045.234          0           0         0
Yices2               0             213972          2056.252         2063.955          0           0         0
CVC4-uc              0             213115          3152.264         3137.16           0           0         0
MathSAT5 n           8 *           193215          1406.564         1353.152          0           0         0

Parallel Performance

Solver               Error Score   Correct Score   CPU Time Score   Wall Time Score   Abstained   Timeout   Memout
SMTInterpol-fixed n  0             215386          11536.215        6655.905          0           0         0
SMTInterpol          0             215366          11540.921        6636.559          0           0         0
z3 n                 0             215351          3394.389         3381.79           0           0         0
Yices2-fixed n       0             213972          2044.95          2045.234          0           0         0
Yices2               0             213972          2056.252         2063.955          0           0         0
CVC4-uc              0             213115          3152.264         3137.16           0           0         0
MathSAT5 n           8 *           193215          1406.564         1353.152          0           0         0

n Non-competing.
Abstained: Total number of benchmarks in this division's logics from which the solver chose to abstain. For SAT/UNSAT scores, this column also includes benchmarks whose status is not known to be SAT/UNSAT.
* The error score is caused by MathSAT using the wrong names in its unsat-core output (a syntactic problem). It does not indicate unsoundness.
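The MathSAT footnote points at the first thing unsat-core validation checks: every name a solver reports in its core must match an assertion labelled with `:named` in the SMT-LIB input. The sketch below is a minimal, hypothetical illustration of such a syntactic check (it is not the competition's actual scoring code, and the helper names are invented):

```python
import re

def named_assertions(smt2_text):
    """Collect labels declared via (assert (! ... :named LABEL)) annotations."""
    return set(re.findall(r':named\s+([^\s)]+)', smt2_text))

def check_unsat_core(smt2_text, core_names):
    """Return the reported core names that do not match any labelled
    assertion in the benchmark -- a purely syntactic check, which is
    the kind of mismatch the MathSAT footnote describes."""
    declared = named_assertions(smt2_text)
    return [name for name in core_names if name not in declared]

# A toy QF_UF-style benchmark fragment with three named assertions.
benchmark = """
(assert (! (= a b) :named A1))
(assert (! (= b c) :named A2))
(assert (! (not (= a c)) :named A3))
"""

# A well-formed core refers only to declared names:
print(check_unsat_core(benchmark, ["A1", "A3"]))   # []
# A core using wrong names fails the syntactic check:
print(check_unsat_core(benchmark, ["a1", "A4"]))   # ['a1', 'A4']
```

A solver whose core uses undeclared names would be flagged exactly as in the second call; as the footnote notes, this says nothing about the soundness of the underlying unsatisfiability result.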