SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.


SMT-COMP 2021


QF_LRA (Cloud Track)

Competition results for the QF_LRA logic in the Cloud Track.

Page generated on 2021-07-18 17:32:03 +0000

Benchmarks: 16
Time Limit: 1200 seconds
Memory Limit: N/A

This track is experimental: solvers are ranked by performance, but no winner is declared.
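The ordering used in the tables below can be reproduced with a small sketch. The ranking criteria here are an assumption based on common SMT-COMP practice (fewer errors first, then more correct results, then lower wall-time score); the figures are taken from the Parallel Performance table.

```python
# Sketch: ordering solvers as in the Parallel Performance table.
# Assumed ranking criteria (not stated on this page): error score
# ascending, correct score descending, wall-time score ascending.
from typing import NamedTuple

class Result(NamedTuple):
    solver: str
    error_score: int
    correct_score: int
    wall_time: float  # wall-time score in seconds

results = [
    Result("cvc5-gg", 0, 4, 16591.225),
    Result("Par4", 0, 11, 9415.435),
    Result("SMTS portfolio", 0, 12, 7828.708),
    Result("SMTS cube-and-conquer", 0, 12, 8513.236),
]

ranked = sorted(results,
                key=lambda r: (r.error_score, -r.correct_score, r.wall_time))
for r in ranked:
    print(r.solver)
```

With these figures the two SMTS configurations tie on correct results, so the lower wall-time score puts the portfolio configuration first.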

Parallel Performance

Solver                  Error Score  Correct Score  Wall Time Score  Solved  Solved SAT  Solved UNSAT  Unsolved  Timeout  Memout
SMTS portfolio          0            12             7828.708         12      5           7             4         4        0
SMTS cube-and-conquer   0            12             8513.236         12      5           7             4         4        0
Par4 (n)                0            11             9415.435         11      6           5             5         5        0
cvc5-gg                 0            4              16591.225        4       1           3             12        12       0
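The row figures above are internally consistent, which can be checked mechanically: each solver's solved count is the sum of its SAT and UNSAT results, and solved plus unsolved equals the 16 benchmarks in the division.

```python
# Sanity check on the Parallel Performance rows, using the
# (solved, solved_sat, solved_unsat, unsolved) figures from the table.
rows = {
    "SMTS portfolio":        (12, 5, 7, 4),
    "SMTS cube-and-conquer": (12, 5, 7, 4),
    "Par4":                  (11, 6, 5, 5),
    "cvc5-gg":               (4, 1, 3, 12),
}

BENCHMARKS = 16  # benchmark count for this division
for solver, (solved, sat, unsat, unsolved) in rows.items():
    assert solved == sat + unsat, solver
    assert solved + unsolved == BENCHMARKS, solver
print("all rows consistent")
```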

SAT Performance

Solver                  Error Score  Correct Score  Wall Time Score  Solved  Solved SAT  Solved UNSAT  Unsolved  N/A  Timeout  Memout
Par4 (n)                0            6              3936.294         6       6           0             2         8    5        0
SMTS portfolio          0            5              5581.365         5       5           0             3         8    4        0
SMTS cube-and-conquer   0            5              5683.292         5       5           0             3         8    4        0
cvc5-gg                 0            1              8941.437         1       1           0             7         8    12       0

UNSAT Performance

Solver                  Error Score  Correct Score  Wall Time Score  Solved  Solved SAT  Solved UNSAT  Unsolved  N/A  Timeout  Memout
SMTS portfolio          0            7              1047.343         7       0           7             0         9    4        0
SMTS cube-and-conquer   0            7              1629.944         7       0           7             0         9    4        0
Par4 (n)                0            5              4279.141         5       0           5             2         9    5        0
cvc5-gg                 0            3              6449.788         3       0           3             4         9    12       0

24 seconds Performance

Solver                  Error Score  Correct Score  Wall Time Score  Solved  Solved SAT  Solved UNSAT  Unsolved  Timeout  Memout
SMTS cube-and-conquer   0            4              295.225          4       3           1             12        12       0
SMTS portfolio          0            4              300.703          4       3           1             12        12       0
Par4 (n)                0            4              315.263          4       3           1             12        12       0
cvc5-gg                 0            1              369.604          1       0           1             15        15       0

(n) Non-competing.
N/A: benchmarks not known to be SAT or UNSAT, respectively.