SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.


SMT-COMP 2023


QF_Equality+Bitvec (Unsat Core Track)

Competition results for the QF_Equality+Bitvec division in the Unsat Core Track.

Page generated on 2023-07-06 16:05:43 +0000

Benchmarks: 2226
Time Limit: 1200 seconds
Memory Limit: 60 GB

Logics:

Winners

Sequential Performance: Yices2
Parallel Performance: Yices2

Sequential Performance

Solver              Error Score  Correct Score  CPU Time Score  Wall Time Score  Abstained  Timeout  Memout
2022-Bitwuzla (n)   0            1003528        20233.877       20176.719        0          19       0
Yices2              0            791927         11424.269       11386.017        0          70       0
Bitwuzla Fixed (n)  0            398437         17963.764       17962.585        0          12       0
cvc5                0            111565         20803.05        20612.898        0          30       0
Bitwuzla            3*           400082         18068.085       18075.754        0          11       0

Parallel Performance

Solver              Error Score  Correct Score  CPU Time Score  Wall Time Score  Abstained  Timeout  Memout
2022-Bitwuzla (n)   0            1003528        20233.877       20176.719        0          19       0
Yices2              0            791927         11424.269       11386.017        0          70       0
Bitwuzla Fixed (n)  0            398437         17963.764       17962.585        0          12       0
cvc5                0            111565         18403.51        18212.728        0          30       0
Bitwuzla            3*           400082         18068.085       18075.754        0          11       0

(n) Non-competing.
Abstained: Total number of benchmarks in the logics of this division that the solver chose to abstain from. For SAT/UNSAT scores, this column also includes benchmarks not known to be SAT/UNSAT.
* The error score is caused by Bitwuzla printing the wrong names in its unsat core output (a syntactic problem). It does not indicate unsoundness.
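
For context on the footnote above: in the SMT-LIB language, an unsat core is reported as the list of :named labels attached to the asserted formulas, so printing names that were never introduced is a syntactic defect rather than a wrong sat/unsat answer. Below is a minimal sketch of how named assertions and (get-unsat-core) interact; the logic, constraints, and names are invented for illustration and are not taken from the competition benchmarks.

    (set-option :produce-unsat-cores true)
    (set-logic QF_UFBV)
    (declare-const x (_ BitVec 8))
    ; two mutually inconsistent assertions, each carrying a name
    (assert (! (= x #x00) :named zero))
    (assert (! (bvugt x #x0f) :named big))
    (check-sat)        ; a solver answers: unsat
    (get-unsat-core)   ; the output must list the :named labels, e.g. (zero big)

A core that echoes labels other than those introduced with :named (or malformed ones) is flagged as an error by the track's validation, independently of whether the underlying unsatisfiability claim is correct.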