SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2021

Bitvec (Cloud Track)

Competition results for the Bitvec division in the Cloud Track.

Page generated on 2021-07-18 17:32:03 +0000

Benchmarks: 16
Time Limit: 1200 seconds
Memory Limit: N/A

Logics:

This track is experimental: solvers are ranked by performance only, and no winner is selected.

Parallel Performance

Solver    Error Score  Correct Score  Wall Time Score  Solved  Solved SAT  Solved UNSAT  Unsolved  Abstained  Timeout  Memout
Par4 n    0            6              14139.148        6       5           1             10        0          10       0
cvc5-gg   0            2              6004.862         2       0           2             14        0          5        0

SAT Performance

Solver    Error Score  Correct Score  Wall Time Score  Solved  Solved SAT  Solved UNSAT  Unsolved  Abstained  Timeout  Memout
Par4 n    0            5              1390.618         5       5           0             0         11         10       0
cvc5-gg   0            0              2400.0           0       0           0             5         11         5        0

UNSAT Performance

Solver    Error Score  Correct Score  Wall Time Score  Solved  Solved SAT  Solved UNSAT  Unsolved  Abstained  Timeout  Memout
cvc5-gg   0            2              1204.862         2       0           2             3         11         5        0
Par4 n    0            1              5548.53          1       0           1             4         11         10       0

24 seconds Performance

Solver    Error Score  Correct Score  Wall Time Score  Solved  Solved SAT  Solved UNSAT  Unsolved  Abstained  Timeout  Memout
cvc5-gg   0            2              124.862          2       0           2             14        0          5        0
Par4 n    0            2              345.438          2       2           0             14        0          14       0

n Non-competing.
Abstained: total number of benchmarks in this division's logics that the solver chose to abstain from. For the SAT/UNSAT performance tables, this column also includes benchmarks not known to be SAT/UNSAT, respectively.
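
Below is a minimal sketch, not the official SMT-COMP scoring code, of how a ranking could be read off the Parallel Performance table above. It assumes solvers are ordered by ascending error score, then descending correct score, then ascending wall-time score; the figures are copied from that table, and the non_competing flag mirrors the "n" footnote. Since this track selects no winner, such an ordering is informational only.

from dataclasses import dataclass

@dataclass
class Result:
    solver: str
    error_score: int
    correct_score: int
    wall_time_score: float
    non_competing: bool = False

# Parallel Performance figures from the table above.
results = [
    Result("Par4", 0, 6, 14139.148, non_competing=True),
    Result("cvc5-gg", 0, 2, 6004.862),
]

# Assumed ordering: fewer errors first, then more correct answers, then less wall time.
ranked = sorted(results, key=lambda r: (r.error_score, -r.correct_score, r.wall_time_score))

for place, r in enumerate(ranked, start=1):
    tag = " (non-competing)" if r.non_competing else ""
    print(f"{place}. {r.solver}{tag}: correct={r.correct_score}, wall time={r.wall_time_score}")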