SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2021

QF_LRA (Unsat Core Track)

Competition results for the QF_LRA logic in the Unsat Core Track.

Page generated on 2021-07-18 17:31:25 +0000

Benchmarks: 196
Time Limit: 1200 seconds
Memory Limit: 60 GB

Winners

Sequential Performance: Yices2
Parallel Performance:   Yices2

Sequential Performance

Solver               Error Score   Correct Score   CPU Time Score   Wall Time Score   Timeout   Memout
2020-Yices2 n                  0          146557        68745.629         68695.713        37        0
Yices2                         0          140760        80304.012         80297.183        47        0
cvc5-uc                        0          138734        45084.307         45091.443        14        0
z3 n                           0          118077        73539.143         73438.998        33        0
MathSAT5 n                     0          115011       146153.994        146181.835       106        0
SMTInterpol                    0           93971        107306.53        103552.672        52        0
SMTInterpol-remus              0           75322       158746.624        154062.196         9        0

Parallel Performance

Solver               Error Score   Correct Score   CPU Time Score   Wall Time Score   Timeout   Memout
2020-Yices2 n                  0          146557        68749.059         68694.633        37        0
Yices2                         0          140760        80307.662         80295.823        47        0
cvc5-uc                        0          138734        45086.497         45090.853        14        0
z3 n                           0          118077        73541.293         73438.098        33        0
MathSAT5 n                     0          115011       146173.094        146176.625       106        0
SMTInterpol                    0           94356        107328.87        103543.272        51        0
SMTInterpol-remus              0           80954       158919.954        153154.585         3        0

n: Non-competing (shown for comparison; not eligible to win).
N/A: benchmark not known to be SAT/UNSAT, respectively.
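The ordering of the tables above can be sketched in a few lines. This is an illustrative sketch, not the official SMT-COMP scoring scripts: it assumes the usual SMT-COMP lexicographic ranking (fewer errors first, then higher correct score, then lower wall-time score) and that non-competing entries (marked "n") cannot win a division. The `Row` type and `ranking`/`winner` helpers are hypothetical names introduced here for illustration.

```python
# Illustrative sketch (not the official scoring tooling): rank the
# Sequential Performance rows and pick the division winner.
from typing import List, NamedTuple

class Row(NamedTuple):
    solver: str
    noncompeting: bool  # marked "n" in the tables above
    errors: int
    correct: int
    cpu: float
    wall: float
    timeouts: int
    memouts: int

# Data transcribed from the Sequential Performance table.
SEQUENTIAL = [
    Row("2020-Yices2", True, 0, 146557, 68745.629, 68695.713, 37, 0),
    Row("Yices2", False, 0, 140760, 80304.012, 80297.183, 47, 0),
    Row("cvc5-uc", False, 0, 138734, 45084.307, 45091.443, 14, 0),
    Row("z3", True, 0, 118077, 73539.143, 73438.998, 33, 0),
    Row("MathSAT5", True, 0, 115011, 146153.994, 146181.835, 106, 0),
    Row("SMTInterpol", False, 0, 93971, 107306.53, 103552.672, 52, 0),
    Row("SMTInterpol-remus", False, 0, 75322, 158746.624, 154062.196, 9, 0),
]

def ranking(rows: List[Row]) -> List[Row]:
    # Lexicographic sort: errors ascending, correct score descending,
    # wall-time score ascending (assumed SMT-COMP ordering).
    return sorted(rows, key=lambda r: (r.errors, -r.correct, r.wall))

def winner(rows: List[Row]) -> str:
    # Best-ranked competing (non-"n") solver takes the division, which
    # is why Yices2 wins although 2020-Yices2 has a higher correct score.
    return next(r.solver for r in ranking(rows) if not r.noncompeting)

print(winner(SEQUENTIAL))  # prints: Yices2
```

The same helpers applied to the Parallel Performance rows would again select Yices2, matching the Winners section above.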