SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2021

QF_Equality+Bitvec (Unsat Core Track)

Competition results for the QF_Equality+Bitvec division in the Unsat Core Track.

Page generated on 2021-07-18 17:31:25 +0000

Benchmarks: 2225
Time Limit: 1200 seconds
Memory Limit: 60 GB

Logics:

Winners

Sequential Performance    Parallel Performance
Bitwuzla                  Bitwuzla

Sequential Performance

Solver        Error Score  Correct Score  CPU Time Score  Wall Time Score  Abstained  Timeout  Memout
Bitwuzla                0        1150402       53456.162        53471.598          0       21       0
cvc5-uc                 0        1070957       90334.968        90335.822          0       53       0
z3n                     0         772472      128922.791       129694.887          0       98       0
2020-Yices2n            0         771830      112554.581       112602.368          0       82       0
Yices2                  0         771830      112966.489       112908.236          0       82       0
2020-z3n                0         767877      124103.604       124120.656          0       96       0
MathSAT5n               0             35          77.863           79.756          0        0       0

Parallel Performance

Solver        Error Score  Correct Score  CPU Time Score  Wall Time Score  Abstained  Timeout  Memout
Bitwuzla                0        1150402       53460.452        53470.608          0       21       0
cvc5-uc                 0        1070957       90348.168        90333.422          0       53       0
z3n                     0         772472      129464.921       129690.527          0       98       0
2020-Yices2n            0         771830      112576.721       112598.708          0       82       0
Yices2                  0         771830      112986.849       112904.486          0       82       0
2020-z3n                0         767877      124126.074       124116.486          0       96       0
MathSAT5n               0             35          77.863           79.756          0        0       0

n: Non-competing.
Abstained: Total number of benchmarks, among the logics in this division, on which the solver chose to abstain. For SAT/UNSAT scores, this column also includes benchmarks not known to be SAT/UNSAT.