SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2023


QF_ADT+BitVec (Model Validation Track)

Competition results for the QF_ADT+BitVec division in the Model Validation Track.

Page generated on 2023-07-06 16:06:00 +0000

Benchmarks: 5251
Time Limit: 1200 seconds
Memory Limit: 60 GB

This division is experimental: solvers are ranked by performance only, and no winner is selected.
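As background, in the Model Validation Track a solver must answer "sat" and emit a model, which is then checked against the benchmark's assertions by an independent validator. Below is a toy Python sketch of that idea, with hypothetical assertions standing in for SMT-LIB formulas; the real track validates (get-model) output against the original script, not Python callables.

```python
# Toy illustration of model validation: a solver claims "sat" and returns a
# model (an assignment to the free variables); the validator re-evaluates
# every assertion under that assignment. All names here are hypothetical.

def validate_model(assertions, model):
    """Return True iff every assertion evaluates to True under the model."""
    return all(check(model) for check in assertions)

# Benchmark sketch over 8-bit bit-vectors: x + y == 16 and x & y == 0.
MASK = 0xFF  # wrap arithmetic to 8 bits, as QF_BV semantics would
assertions = [
    lambda m: (m["x"] + m["y"]) & MASK == 16,
    lambda m: m["x"] & m["y"] == 0,
]

claimed_model = {"x": 16, "y": 0}                     # what a solver might return
print(validate_model(assertions, claimed_model))      # True: model is accepted
print(validate_model(assertions, {"x": 5, "y": 5}))   # False: model is rejected
```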

Sequential Performance

Solver             Error Score  Correct Score  CPU Time Score  Wall Time Score  Abstained  Timeout  Memout
Bitwuzla Fixedⁿ    0            5213           7751.025        7790.875                    3        0
Bitwuzla           0            5213           7765.231        7812.831                    3        0
cvc5               0            4839           128277.483      128210.699                  400      1
Yices2             0            7              2599.256        2599.969                             0

Parallel Performance

Solver             Error Score  Correct Score  CPU Time Score  Wall Time Score  Abstained  Timeout  Memout
Bitwuzla Fixedⁿ    0            5213           7751.025        7790.875                    3        0
Bitwuzla           0            5213           7765.231        7812.831                    3        0
cvc5               0            4839           128277.483      128210.699                  400      1
Yices2             0            7              2599.256        2599.969                             0

ⁿ Non-competing.
Abstained: Total number of benchmarks in this division's logics that the solver chose to abstain from. For SAT/UNSAT scores, this column also includes benchmarks not known to be SAT/UNSAT.
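For readers who want to reproduce the ordering, here is a minimal sketch of how a ranking can be derived from the score columns above, assuming the usual SMT-COMP ordering (fewer errors first, then more correct answers, then lower time score; CPU time for the sequential ranking, wall time for the parallel one). The Row type and sequential_key function are illustrative names, not part of any competition tooling, and as noted above no winner is declared in this experimental division.

```python
# Sketch of ranking solvers from the score columns above. The sort key is an
# assumption based on the SMT-COMP rules: errors ascending, correct answers
# descending, CPU time score ascending.

from typing import NamedTuple

class Row(NamedTuple):
    solver: str
    errors: int
    correct: int
    cpu_time: float
    wall_time: float

rows = [
    Row("Bitwuzla Fixed", 0, 5213, 7751.025, 7790.875),    # non-competing
    Row("Bitwuzla",       0, 5213, 7765.231, 7812.831),
    Row("cvc5",           0, 4839, 128277.483, 128210.699),
    Row("Yices2",         0, 7,    2599.256, 2599.969),
]

def sequential_key(r: Row):
    # Lower error score wins, then higher correct score, then lower CPU time.
    return (r.errors, -r.correct, r.cpu_time)

for rank, r in enumerate(sorted(rows, key=sequential_key), start=1):
    print(rank, r.solver)
# 1 Bitwuzla Fixed, 2 Bitwuzla, 3 cvc5, 4 Yices2
```

Swapping wall_time for cpu_time in the key yields the parallel ranking; on this division the two orders coincide, since all four solvers ran effectively single-threaded.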