SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2020

QF_LRA (Unsat Core Track)

Competition results for the QF_LRA division in the Unsat Core Track.
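To illustrate what this track measures: unsat-core benchmarks are SMT-LIB 2 scripts in which the solver must report a subset of the named assertions that is itself unsatisfiable. Below is a minimal toy script in the QF_LRA logic (a hypothetical illustration, not a benchmark from the competition set):

    (set-option :produce-unsat-cores true)
    (set-logic QF_LRA)
    (declare-const x Real)
    (declare-const y Real)
    (assert (! (> x 0) :named a1))
    (assert (! (> y 0) :named a2))
    (assert (! (< (+ x y) 0) :named a3))
    (assert (! (<= x 10) :named a4))   ; irrelevant to the conflict
    (check-sat)                        ; unsat
    (get-unsat-core)                   ; e.g. (a1 a2 a3); a4 is not needed

Here a1, a2, and a3 are jointly unsatisfiable (x > 0 and y > 0 force x + y > 0), while a4 is redundant, so the best answer omits it: smaller validated cores earn higher scores.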

Page generated on 2020-07-04 11:49:33 +0000

Benchmarks: 196
Time Limit: 1200 seconds
Memory Limit: 60 GB

Winners

Sequential Performance: Yices2
Parallel Performance: Yices2

Sequential Performance

Solver              Error Score  Correct Score  CPU Time Score  Wall Time Score  Abstained  Timeout  Memout
Yices2                        0         146931       71272.537        71277.725          -       39       0
Yices2-fixed*                 0         146931       71327.145        71324.87           -       39       0
CVC4-uc                       0         143126       41918.385        41812.589          -       14       0
z3*                           0         121290       65700.443        65662.156          -       31       0
MathSAT5*                     0         112489      146652.922       146682.987          -      103       0
SMTInterpol-fixed*            0          95849      111449.826       108323.251          -       61       0
SMTInterpol                   0          88745      111264.6         108124.06           -       61       0

Parallel Performance

Solver              Error Score  Correct Score  CPU Time Score  Wall Time Score  Abstained  Timeout  Memout
Yices2                        0         146931       71274.577        71276.605          -       39       0
Yices2-fixed*                 0         146931       71328.635        71323.8            -       39       0
CVC4-uc                       0         143126       41921.095        41812.059          -       14       0
z3*                           0         121290       65702.533        65660.976          -       31       0
MathSAT5*                     0         112489      146673.732       146677.817          -      103       0
SMTInterpol-fixed*            0          95904      111463.026       108293.901          -       60       0
SMTInterpol                   0          88800      111279.97        108095.81           -       60       0

* Non-competing.
Abstained: total number of benchmarks in this division's logics that the solver chose to abstain from. For SAT/UNSAT scores, this column also includes benchmarks not known to be SAT/UNSAT.
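Note that the Correct Score figures far exceed the 196 benchmarks because the Unsat Core Track scores a solver on how much it shrinks each benchmark, not one point per benchmark. Assuming the scoring scheme from the SMT-COMP 2020 rules (an interpretation, not quoted from this page), the per-benchmark score is roughly

    score(b) = (number of top-level assertions in b) - (size of the validated unsat core returned for b)

summed over all benchmarks, so returning a validated core of 3 assertions out of 100 would earn 97 points on that benchmark.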