SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2022

QF_ABV (Unsat Core Track)

Competition results for the QF_ABV logic in the Unsat Core Track.

Page generated on 2022-08-10 11:18:51 +0000

Benchmarks: 1865
Time Limit: 1200 seconds
Memory Limit: 60 GB

Winners

Sequential Performance: Bitwuzla
Parallel Performance: Bitwuzla

Sequential Performance

Solver            Error Score  Correct Score  CPU Time Score  Wall Time Score  Timeout  Memout
Bitwuzla                    0         190264       22725.825        22707.205       10       0
2021-Bitwuzla (n)           0         187186       23813.912        23713.086       12       0
Yices2                      0         180360       24959.114        24933.186       14       0
z3-4.8.17 (n)               0         172030       35685.268        35639.216       24       0
cvc5                        0         127243       29926.816        29899.528       19       0
MathSAT (n)                 0             73          46.919            47.91        0       0

Parallel Performance

Solver            Error Score  Correct Score  CPU Time Score  Wall Time Score  Timeout  Memout
Bitwuzla                    0         190264       22728.895        22706.865       10       0
2021-Bitwuzla (n)           0         187186       23816.532        23712.366       12       0
Yices2                      0         180360       24963.064        24932.596       14       0
z3-4.8.17 (n)               0         172030       35692.478        35638.156       24       0
cvc5                        0         127243       29931.256        29898.768       19       0
MathSAT (n)                 0             73          46.919            47.91        0       0

(n) Non-competing.