SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2022

QF_Bitvec (Unsat Core Track)

Competition results for the QF_Bitvec division in the Unsat Core Track.

Page generated on 2022-08-10 11:18:51 +0000

Benchmarks: 2887
Time Limit: 1200 seconds
Memory Limit: 60 GB

Logics: QF_BV

Winners

Sequential Performance | Parallel Performance
Bitwuzla | Bitwuzla

Sequential Performance

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Abstained | Timeout | Memout
Bitwuzla | 0 | 2301687 | 247133.739 | 246989.583 | 0 | 139 | 0
2020-Bitwuzla-fixed [n] | 0 | 2217052 | 240620.225 | 240479.946 | 0 | 134 | 0
Yices2 | 0 | 1905566 | 350259.717 | 350163.362 | 0 | 64 | 0
z3-4.8.17 [n] | 0 | 1633136 | 1321726.085 | 1321888.485 | 0 | 1010 | 0
cvc5 | 0 | 422993 | 2494764.535 | 2494983.9 | 0 | 1998 | 0
MathSAT [n] | 0 | 0 | 345.647 | 346.012 | 0 | 0 | 0

Parallel Performance

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Abstained | Timeout | Memout
Bitwuzla | 0 | 2301687 | 247155.949 | 246984.343 | 0 | 139 | 0
2020-Bitwuzla-fixed [n] | 0 | 2217052 | 240639.985 | 240475.036 | 0 | 134 | 0
Yices2 | 0 | 1905566 | 350303.297 | 350151.922 | 0 | 64 | 0
z3-4.8.17 [n] | 0 | 1633136 | 1321872.315 | 1321846.265 | 0 | 1010 | 0
cvc5 | 0 | 422993 | 2494994.765 | 2494910.93 | 0 | 1998 | 0
MathSAT [n] | 0 | 0 | 345.647 | 346.012 | 0 | 0 | 0

[n] Non-competing.
Abstained: total number of benchmarks, across the logics in this division, that the solver chose to abstain from. For SAT/UNSAT scores, this column also includes benchmarks not known to be SAT/UNSAT.
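As an illustration of how rows like the ones above are turned into a ranking, the sketch below sorts the sequential-performance results lexicographically: fewest errors first, then highest correct score, then lowest CPU time score. This ordering is a simplification assumed here for illustration; the authoritative ranking and tie-breaking criteria are defined in the competition rules, and non-competing entries (marked [n]) are listed but are not eligible to win.

```python
# Minimal sketch (not official SMT-COMP tooling): rank solvers from a results
# table such as the sequential-performance table above. Assumes a lexicographic
# ordering (fewest errors, then highest correct score, then lowest CPU time),
# which approximates, but does not reproduce, the official scoring rules.
from dataclasses import dataclass


@dataclass
class Result:
    solver: str
    error_score: int
    correct_score: int
    cpu_time_score: float


# Sequential-performance rows from the QF_Bitvec Unsat Core table above.
results = [
    Result("Bitwuzla", 0, 2301687, 247133.739),
    Result("2020-Bitwuzla-fixed", 0, 2217052, 240620.225),
    Result("Yices2", 0, 1905566, 350259.717),
    Result("z3-4.8.17", 0, 1633136, 1321726.085),
    Result("cvc5", 0, 422993, 2494764.535),
    Result("MathSAT", 0, 0, 345.647),
]

# Lower error score ranks first, then higher correct score, then lower CPU time.
ranked = sorted(results, key=lambda r: (r.error_score, -r.correct_score, r.cpu_time_score))
for place, r in enumerate(ranked, start=1):
    print(f"{place}. {r.solver:22s} correct={r.correct_score}")
```

Running the sketch reproduces the ordering shown in the sequential table, with Bitwuzla first, matching the winner listed above.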