SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.


SMT-COMP 2021


QF_Bitvec (Parallel Track)

Competition results for the QF_Bitvec division in the Parallel Track.

Page generated on 2021-07-18 17:32:11 +0000

Benchmarks: 17
Time Limit: 1200 seconds
Memory Limit: N/A

Note: This track is experimental. Solvers are ranked by performance only; no winner is selected.
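As a reading aid for the score columns in the tables below, here is a simplified, unofficial sketch of how they appear to be computed (the authoritative definitions are in the competition rules): the error score counts wrong answers, the correct score counts solved benchmarks, and the wall-time score charges each run its wall-clock time, with timed-out runs charged the 1200-second limit. The `score` helper, the status labels, and the sample runs are illustrative assumptions, not actual competition data.

```python
# Unofficial sketch of the score columns; not the official scoring code.
# Assumption: each run is charged its wall time, capped at the time limit.
TIME_LIMIT = 1200.0  # seconds, per this division's time limit

def score(results, limit=TIME_LIMIT):
    """results: list of (status, wall_time) pairs, one per benchmark.
    status is one of 'sat', 'unsat', 'wrong', 'timeout' (hypothetical labels)."""
    error = sum(1 for s, _ in results if s == 'wrong')
    correct = sum(1 for s, _ in results if s in ('sat', 'unsat'))
    wall = sum(min(t, limit) for _, t in results)
    return error, correct, round(wall, 3)

# Illustrative input: two unsat solves totalling 1098.118 s plus 15 timeouts,
# which reproduces Par4's row in the Parallel Performance table.
runs = [('unsat', 500.0), ('unsat', 598.118)] + [('timeout', 1200.0)] * 15
```

With these sample runs, `score(runs)` yields an error score of 0, a correct score of 2, and a wall-time score of 19098.118, matching Par4's Parallel Performance row.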

Parallel Performance

Solver        Error Score  Correct Score  Wall Time Score  Solved  Solved SAT  Solved UNSAT  Unsolved  Abstained  Timeout  Memout
Par4 n        0            2              19098.118        2       0           2             15        0          15       0
STP-parallel  0            1              16800.01         1       0           1             16        0          14       0
cvc5-gg       0            0              19200.0          0       0           0             17        0          16       0

SAT Performance

Solver        Error Score  Correct Score  Wall Time Score  Solved  Solved SAT  Solved UNSAT  Unsolved  Abstained  Timeout  Memout
STP-parallel  0            0              2400.0           0       0           0             4         13         14       0
cvc5-gg       0            0              4800.0           0       0           0             4         13         16       0
Par4 n        0            0              4800.0           0       0           0             4         13         15       0

UNSAT Performance

Solver        Error Score  Correct Score  Wall Time Score  Solved  Solved SAT  Solved UNSAT  Unsolved  Abstained  Timeout  Memout
Par4 n        0            2              9498.118         2       0           2             7         8          15       0
STP-parallel  0            1              9600.01          1       0           1             8         8          14       0
cvc5-gg       0            0              10800.0          0       0           0             9         8          16       0

24 seconds Performance

Solver        Error Score  Correct Score  Wall Time Score  Solved  Solved SAT  Solved UNSAT  Unsolved  Abstained  Timeout  Memout
STP-parallel  0            1              336.01           1       0           1             16        0          14       0
cvc5-gg       0            0              384.0            0       0           0             17        0          16       0
Par4 n        0            0              408.0            0       0           0             17        0          17       0

n: Non-competing.
Abstained: total number of benchmarks in this division's logics that the solver chose to abstain from. In the SAT/UNSAT performance tables, this column also includes benchmarks not known to be SAT/UNSAT, respectively.