SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2021

QF_LinearIntArith (Cloud Track)

Competition results for the QF_LinearIntArith division in the Cloud Track.

Page generated on 2021-07-18 17:32:03 +0000

Benchmarks: 31
Time Limit: 1200 seconds
Memory Limit: N/A

Note: This track is experimental. Solvers are ranked by performance only; no winner is selected.

Parallel Performance

Solver                 | Error Score | Correct Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Abstained | Timeout | Memout
Par4 n                 | 0           | 11            | 30274.421       | 11     | 5          | 6            | 20       | 0         | 20      | 0
SMTS cube-and-conquer  | 0           | 10            | 27656.046       | 10     | 10         | 0            | 21       | 0         | 21      | 0
SMTS portfolio         | 0           | 9             | 27947.422       | 9      | 9          | 0            | 22       | 0         | 22      | 0
cvc5-gg                | 0           | 0             | 32400.0         | 0      | 0          | 0            | 31       | 0         | 27      | 0

SAT Performance

Solver                 | Error Score | Correct Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Abstained | Timeout | Memout
SMTS cube-and-conquer  | 0           | 10            | 8456.046        | 10     | 10         | 0            | 5        | 16        | 21      | 0
SMTS portfolio         | 0           | 9             | 8747.422        | 9      | 9          | 0            | 6        | 16        | 22      | 0
Par4 n                 | 0           | 5             | 14410.91        | 5      | 5          | 0            | 10       | 16        | 20      | 0
cvc5-gg                | 0           | 0             | 18000.0         | 0      | 0          | 0            | 15       | 16        | 27      | 0

UNSAT Performance

Solver                 | Error Score | Correct Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Abstained | Timeout | Memout
Par4 n                 | 0           | 6             | 9863.511        | 6      | 0          | 6            | 5        | 20        | 20      | 0
cvc5-gg                | 0           | 0             | 8400.0          | 0      | 0          | 0            | 11       | 20        | 27      | 0
SMTS cube-and-conquer  | 0           | 0             | 13200.0         | 0      | 0          | 0            | 11       | 20        | 21      | 0
SMTS portfolio         | 0           | 0             | 13200.0         | 0      | 0          | 0            | 11       | 20        | 22      | 0

24 seconds Performance

Solver                 | Error Score | Correct Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Abstained | Timeout | Memout
SMTS cube-and-conquer  | 0           | 2             | 727.457         | 2      | 2          | 0            | 29       | 0         | 29      | 0
SMTS portfolio         | 0           | 1             | 728.54          | 1      | 1          | 0            | 30       | 0         | 30      | 0
cvc5-gg                | 0           | 0             | 648.0           | 0      | 0          | 0            | 31       | 0         | 27      | 0
Par4 n                 | 0           | 0             | 744.0           | 0      | 0          | 0            | 31       | 0         | 31      | 0

n Non-competing.
Abstained: the number of benchmarks in this division's logics that the solver chose to abstain from. For the SAT/UNSAT scores, this column also counts benchmarks not known to be SAT/UNSAT, respectively.