SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2023

QF_Bitvec (Unsat Core Track)

Competition results for the QF_Bitvec division in the Unsat Core Track.

Page generated on 2023-07-06 16:05:43 +0000

Benchmarks: 2949
Time Limit: 1200 seconds
Memory Limit: 60 GB

Logics: QF_BV
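
For context, a solver in the Unsat Core Track receives an unsatisfiable benchmark whose assertions carry names, and must answer with a subset of those names whose assertions are still unsatisfiable. The snippet below is a minimal QF_BV illustration, not taken from the competition benchmarks; the answer shown in the comment is one possible core, not a prescribed one:

  (set-option :produce-unsat-cores true)
  (set-logic QF_BV)
  (declare-const x (_ BitVec 8))
  (assert (! (= x #x00) :named a0))
  (assert (! (= x #x01) :named a1))
  (assert (! (bvult x #x10) :named a2))
  (check-sat)        ; unsat
  (get-unsat-core)   ; one possible answer: (a0 a1)

Roughly, a validated core is scored by how many assertions it removes (here, a two-name core removes 1 of 3 assertions), and the Correct Score columns below total these reductions over the division's 2949 benchmarks, which is why they run into the millions.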

Winners

Sequential Performance   Parallel Performance
Yices2                   Yices2

Sequential Performance

Solver             Error Score   Correct Score   CPU Time Score   Wall Time Score   Abstained   Timeout   Memout
2022-Bitwuzla^n    0             2645285         93867.102        93676.365         0           173       0
Bitwuzla Fixed^n   0             2626328         88412.66         88387.745         0           122       0
Yices2             0             2308469         42822.945        42823.097         0           332       0
cvc5               0             688575          229488.093       229361.081        0           595       0
Bitwuzla           30*           2540736         96843.954        96697.565         0           349       0

Parallel Performance

Solver             Error Score   Correct Score   CPU Time Score   Wall Time Score   Abstained   Timeout   Memout
2022-Bitwuzla^n    0             2645285         93867.102        93676.365         0           173       0
Bitwuzla Fixed^n   0             2626328         88412.66         88387.745         0           122       0
Yices2             0             2308469         42822.945        42823.097         0           332       0
cvc5               0             688575          229488.093       229361.081        0           595       0
Bitwuzla           30*           2540736         96843.954        96697.565         0           349       0

^n Non-competing.
Abstained: Total number of benchmarks in this division's logics that the solver chose to abstain from. For SAT/UNSAT scores, this column also includes benchmarks not known to be SAT/UNSAT.
* The error score is caused by Bitwuzla using the wrong names in its unsat core output (a syntactic problem); it does not indicate unsoundness.
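
To make the * footnote concrete, here is a hypothetical exchange (the labels are made up for illustration). Suppose a benchmark introduces the names a0, a1, a2 via :named:

  (get-unsat-core)
  ; well-formed answer:  (a0 a1)
  ; erroneous answer:    (core0 core1)   ; labels never introduced in the benchmark

The second answer is scored as an error because its labels cannot be matched back to the benchmark's assertions, even if the underlying assertions do form an unsatisfiable core; the solver's soundness is not in question.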