SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2022


QF_LRA (Cloud Track)

Competition results for the QF_LRA logic in the Cloud Track.

Page generated on 2022-08-10 14:49:53 +0000

Benchmarks: 18
Time Limit: 1200 seconds
Memory Limit: N/A

This track is experimental: solvers are ranked by performance only, and no winner is selected.
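The precise scoring is defined in the 2022 competition rules rather than on this page; as a hedged illustration, the ordering used in the performance tables below can be sketched as a lexicographic sort on (error score ascending, correct score descending, wall-time score ascending):

```python
# Hedged sketch of the performance ordering used in these tables.
# Assumption (based on the SMT-COMP rules, not stated on this page):
# entries are ordered by error score (fewer is better), then correct
# score (more is better), then wall-time score (less is better).
from typing import NamedTuple


class Entry(NamedTuple):
    solver: str
    error_score: int
    correct_score: int
    wall_time_score: float


# Figures copied from the Parallel Performance table below.
ENTRIES = [
    Entry("cvc5-cloud", 10, 4, 18904.175),
    Entry("SMTS cube-and-conquer", 3, 14, 6302.134),
    Entry("SMTS cube-and-conquer (fixed)", 4, 14, 3233.629),
    Entry("SMTS portfolio", 1, 16, 4367.12),
]


def rank(entries):
    """Best solver first: fewest errors, most correct answers, least time."""
    return sorted(
        entries,
        key=lambda e: (e.error_score, -e.correct_score, e.wall_time_score),
    )


for entry in rank(ENTRIES):
    print(entry.solver)
```

Applied to the Parallel Performance figures, this ordering reproduces the row order of the table; since the track is experimental, the ordering carries no winner designation.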

Parallel Performance

Solver                          Error Score  Correct Score  Wall Time Score  Solved  Solved SAT  Solved UNSAT  Unsolved  Timeout  Memout
SMTS portfolio                  1            16             4367.12          16      5           11            3         0        0
SMTS cube-and-conquer           3            14             6302.134         14      3           11            5         0        0
SMTS cube-and-conquer (fixed)   4            14             3233.629         14      2           12            5         0        0
cvc5-cloud                      10           4              18904.175        4       2           2             15        0        0

SAT Performance

Solver                          Error Score  Correct Score  Wall Time Score  Solved  Solved SAT  Solved UNSAT  Unsolved  N/A  Timeout  Memout
SMTS portfolio                  1            5              1776.015         5       5           0             1         12   0        0
SMTS cube-and-conquer (fixed)   2            2              2338.112         2       2           0             4         12   0        0
SMTS cube-and-conquer           3            3              3578.498         3       3           0             3         12   0        0
cvc5-cloud                      4            2              5615.68          2       2           0             4         12   0        0

UNSAT Performance

Solver                          Error Score  Correct Score  Wall Time Score  Solved  Solved SAT  Solved UNSAT  Unsolved  N/A  Timeout  Memout
SMTS portfolio                  0            11             1602.533         11      0           11            1         6    0        0
SMTS cube-and-conquer           0            11             1739.069         11      0           11            1         6    0        0
SMTS cube-and-conquer (fixed)   2            12             462.425          12      0           12            0         6    0        0
cvc5-cloud                      6            2              12088.495        2       0           2             10        6    0        0

24 seconds Performance

Solver                          Error Score  Correct Score  Wall Time Score  Solved  Solved SAT  Solved UNSAT  Unsolved  Timeout  Memout
SMTS portfolio                  0            7              362.177          7       2           5             12        12       0
SMTS cube-and-conquer           0            7              387.461          7       1           6             12        12       0
cvc5-cloud                      0            1              434.566          1       0           1             18        18       0
SMTS cube-and-conquer (fixed)   2            6              387.47           6       1           5             13        11       0

n: Non-competing.
N/A: Benchmarks not known to be SAT/UNSAT, respectively.
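The N/A bookkeeping in the SAT and UNSAT tables can be made concrete with a small, hypothetical sketch. Assumption (not taken from this page): each benchmark carries an expected status of "sat", "unsat", or "unknown"; the SAT sub-table scores only known-sat benchmarks and reports everything else as N/A, and the UNSAT sub-table does the same with known-unsat benchmarks.

```python
# Hypothetical sketch of the N/A column semantics described above.
from collections import Counter


def sat_table_buckets(benchmarks):
    """Split (name, status) pairs into scored vs. N/A for the SAT sub-table."""
    counts = Counter(
        "scored" if status == "sat" else "n/a" for _, status in benchmarks
    )
    return counts["scored"], counts["n/a"]


# Toy status assignment matching this division's totals as reported in the
# tables above: 6 known-sat and 12 known-unsat benchmarks (the assignment
# of statuses to individual benchmarks is illustrative).
BENCHMARKS = [(f"b{i}", "sat") for i in range(6)] + [
    (f"b{i}", "unsat") for i in range(6, 18)
]

scored, na = sat_table_buckets(BENCHMARKS)
print(scored, na)  # prints: 6 12
```

The result matches the SAT table's columns: 6 scored benchmarks (solved plus unsolved) and 12 in the N/A column.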