SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.


SMT-COMP 2020


QF_UFLRA (Unsat Core Track)

Competition results for the QF_UFLRA division in the Unsat Core Track.

Page generated on 2020-07-04 11:49:33 +0000

Benchmarks: 101
Time Limit: 1200 seconds
Memory Limit: 60 GB

Winners

Sequential Performance: CVC4-uc
Parallel Performance:   CVC4-uc

Sequential Performance

Solver                 Error Score   Correct Score   CPU Time Score   Wall Time Score   Abstained   Timeout   Memout
z3 (n)                 0             60              67.638           67.674            0           0         0
MathSAT5 (n)           0             60              115.612          115.758           0           0         0
CVC4-uc                0             59              258.125          258.358           0           0         0
Yices2                 0             58              20.992           21.809            0           0         0
Yices2-fixed (n)       0             58              26.639           21.128            0           0         0
SMTInterpol-fixed (n)  0             58              1130.174         868.786           0           0         0
SMTInterpol            0             58              1130.381         878.416           0           0         0

Parallel Performance

Solver                 Error Score   Correct Score   CPU Time Score   Wall Time Score   Abstained   Timeout   Memout
z3 (n)                 0             60              67.638           67.674            0           0         0
MathSAT5 (n)           0             60              115.612          115.758           0           0         0
CVC4-uc                0             59              258.125          258.358           0           0         0
Yices2-fixed (n)       0             58              26.639           21.128            0           0         0
Yices2                 0             58              20.992           21.809            0           0         0
SMTInterpol-fixed (n)  0             58              1130.174         868.786           0           0         0
SMTInterpol            0             58              1130.381         878.416           0           0         0

(n) Non-competing.
Abstained: total number of benchmarks, across the logics in this division, that the solver chose to abstain from. For SAT/UNSAT scores, this column also counts benchmarks not known to be SAT/UNSAT.