SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2021

QF_Bitvec (Cloud Track)

Competition results for the QF_Bitvec division in the Cloud Track.

Page generated on 2021-07-18 17:32:03 +0000

Benchmarks: 16
Time Limit: 1200 seconds
Memory Limit: N/A

Logics: QF_BV

Note: this track is experimental. Solvers are ranked by performance only; no winner is declared.
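
For readers who want to reproduce the ordering shown in the tables below, here is a minimal Python sketch, assuming the usual SMT-COMP performance ordering (fewest wrong answers, then most correct answers, then lowest wall-time score). The `Result` class, its field names, and the `rank` helper are invented for this illustration; the sample numbers are taken from the Parallel Performance table.

```python
# Minimal sketch of the assumed performance ordering: solvers are compared
# lexicographically on (error score ascending, correct score descending,
# wall-time score ascending). Illustration only, not competition tooling.
from dataclasses import dataclass

@dataclass
class Result:
    solver: str
    error_score: int        # wrong answers reported
    correct_score: int      # benchmarks solved correctly
    wall_time_score: float  # accumulated wall-clock time in seconds

def rank(results):
    # Fewest errors first, then most correct answers, then lowest wall time.
    return sorted(results,
                  key=lambda r: (r.error_score, -r.correct_score, r.wall_time_score))

# Sample data from the Parallel Performance table below.
parallel = [
    Result("cvc5-gg", 0, 0, 19200.0),
    Result("Par4", 0, 2, 18234.032),
    Result("STP-CMS-Cloud", 0, 5, 12489.38),
]

for r in rank(parallel):
    print(f"{r.solver}: {r.correct_score} correct, {r.wall_time_score} s")
```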

Parallel Performance

| Solver | Error Score | Correct Score | Wall Time Score (s) | Solved | Solved SAT | Solved UNSAT | Unsolved | Abstained | Timeout | Memout |
|---|---|---|---|---|---|---|---|---|---|---|
| STP-CMS-Cloud | 0 | 5 | 12489.38 | 5 | 2 | 3 | 11 | 0 | 8 | 0 |
| Par4ⁿ | 0 | 2 | 18234.032 | 2 | 0 | 2 | 14 | 0 | 14 | 0 |
| cvc5-gg | 0 | 0 | 19200.0 | 0 | 0 | 0 | 16 | 0 | 16 | 0 |

SAT Performance

| Solver | Error Score | Correct Score | Wall Time Score (s) | Solved | Solved SAT | Solved UNSAT | Unsolved | Abstained | Timeout | Memout |
|---|---|---|---|---|---|---|---|---|---|---|
| STP-CMS-Cloud | 0 | 2 | 1599.15 | 2 | 2 | 0 | 2 | 12 | 8 | 0 |
| cvc5-gg | 0 | 0 | 4800.0 | 0 | 0 | 0 | 4 | 12 | 16 | 0 |
| Par4ⁿ | 0 | 0 | 4800.0 | 0 | 0 | 0 | 4 | 12 | 14 | 0 |

UNSAT Performance

| Solver | Error Score | Correct Score | Wall Time Score (s) | Solved | Solved SAT | Solved UNSAT | Unsolved | Abstained | Timeout | Memout |
|---|---|---|---|---|---|---|---|---|---|---|
| STP-CMS-Cloud | 0 | 3 | 7290.23 | 3 | 0 | 3 | 5 | 8 | 8 | 0 |
| Par4ⁿ | 0 | 2 | 8634.032 | 2 | 0 | 2 | 6 | 8 | 14 | 0 |
| cvc5-gg | 0 | 0 | 9600.0 | 0 | 0 | 0 | 8 | 8 | 16 | 0 |

24 seconds Performance

| Solver | Error Score | Correct Score | Wall Time Score (s) | Solved | Solved SAT | Solved UNSAT | Unsolved | Abstained | Timeout | Memout |
|---|---|---|---|---|---|---|---|---|---|---|
| STP-CMS-Cloud | 0 | 1 | 290.77 | 1 | 0 | 1 | 15 | 0 | 12 | 0 |
| cvc5-gg | 0 | 0 | 384.0 | 0 | 0 | 0 | 16 | 0 | 16 | 0 |
| Par4ⁿ | 0 | 0 | 384.0 | 0 | 0 | 0 | 16 | 0 | 16 | 0 |

ⁿ Non-competing.
Abstained: total number of benchmarks in this division's logics that the solver chose to abstain from. For the SAT and UNSAT performance tables, this column also includes benchmarks not known to be SAT or UNSAT, respectively.