SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2023

QF_UFBV (Single Query Track)

Competition results for the QF_UFBV logic in the Single Query Track.

Page generated on 2023-07-06 16:04:54 +0000

Benchmarks: 300
Time Limit: 1200 seconds
Memory Limit: 60 GB

Winners

| Sequential Performance | Parallel Performance | SAT Performance (parallel) | UNSAT Performance (parallel) | 24s Performance (parallel) |
|---|---|---|---|---|
| Bitwuzla | Bitwuzla | Bitwuzla | Bitwuzla | Bitwuzla |

Sequential Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout |
|---|---|---|---|---|---|---|---|---|---|---|
| 2022-Bitwuzla (n) | 0 | 287 | 12591.348 | 12589.737 | 287 | 178 | 109 | 13 | 12 | 1 |
| Bitwuzla Fixed (n) | 0 | 284 | 19914.508 | 19900.084 | 284 | 176 | 108 | 16 | 15 | 0 |
| Bitwuzla | 0 | 284 | 19921.49 | 19865.311 | 284 | 176 | 108 | 16 | 15 | 0 |
| Yices2 | 0 | 248 | 34926.401 | 34908.5 | 248 | 168 | 80 | 52 | 52 | 0 |
| cvc5 | 0 | 246 | 35213.17 | 35161.613 | 246 | 155 | 91 | 54 | 44 | 9 |
| Z3-Owl Fixed (n) | 0 | 63 | 2953.419 | 2958.478 | 63 | 40 | 23 | 237 | 76 | 0 |
| UltimateIntBlastingWrapper+SMTInterpol | 0 | 8 | 3972.506 | 3570.618 | 8 | 8 | 0 | 292 | 205 | 1 |
| Z3-Owl | 4 | 217 | 39842.755 | 39850.645 | 217 | 144 | 73 | 83 | 79 | 0 |

Parallel Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout |
|---|---|---|---|---|---|---|---|---|---|---|
| 2022-Bitwuzla (n) | 0 | 287 | 12591.348 | 12589.737 | 287 | 178 | 109 | 13 | 12 | 1 |
| Bitwuzla | 0 | 284 | 19921.49 | 19865.311 | 284 | 176 | 108 | 16 | 15 | 0 |
| Bitwuzla Fixed (n) | 0 | 284 | 19914.508 | 19900.084 | 284 | 176 | 108 | 16 | 15 | 0 |
| Yices2 | 0 | 248 | 34926.401 | 34908.5 | 248 | 168 | 80 | 52 | 52 | 0 |
| cvc5 | 0 | 246 | 35213.17 | 35161.613 | 246 | 155 | 91 | 54 | 44 | 9 |
| Z3-Owl Fixed (n) | 0 | 63 | 2953.419 | 2958.478 | 63 | 40 | 23 | 237 | 76 | 0 |
| UltimateIntBlastingWrapper+SMTInterpol | 0 | 9 | 5195.056 | 4733.648 | 9 | 9 | 0 | 291 | 204 | 1 |
| Z3-Owl | 4 | 217 | 39842.755 | 39850.645 | 217 | 144 | 73 | 83 | 79 | 0 |

SAT Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | N/A | Timeout | Memout |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 2022-Bitwuzla (n) | 0 | 178 | 5450.98 | 5447.405 | 178 | 178 | 0 | 5 | 117 | 12 | 1 |
| Bitwuzla | 0 | 176 | 14078.361 | 14079.457 | 176 | 176 | 0 | 7 | 117 | 15 | 0 |
| Bitwuzla Fixed (n) | 0 | 176 | 14113.216 | 14117.829 | 176 | 176 | 0 | 7 | 117 | 15 | 0 |
| Yices2 | 0 | 168 | 12924.511 | 12904.964 | 168 | 168 | 0 | 15 | 117 | 52 | 0 |
| cvc5 | 0 | 155 | 14527.536 | 14480.684 | 155 | 155 | 0 | 28 | 117 | 44 | 9 |
| Z3-Owl Fixed (n) | 0 | 40 | 2048.637 | 2051.291 | 40 | 40 | 0 | 143 | 117 | 76 | 0 |
| UltimateIntBlastingWrapper+SMTInterpol | 0 | 9 | 5195.056 | 4733.648 | 9 | 9 | 0 | 174 | 117 | 204 | 1 |
| Z3-Owl | 4 | 144 | 22507.975 | 22512.336 | 144 | 144 | 0 | 39 | 117 | 79 | 0 |

UNSAT Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | N/A | Timeout | Memout |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 2022-Bitwuzla (n) | 0 | 109 | 7140.369 | 7142.333 | 109 | 0 | 109 | 1 | 190 | 12 | 1 |
| Bitwuzla Fixed (n) | 0 | 108 | 5801.292 | 5782.255 | 108 | 0 | 108 | 2 | 190 | 15 | 0 |
| Bitwuzla | 0 | 108 | 5843.129 | 5785.854 | 108 | 0 | 108 | 2 | 190 | 15 | 0 |
| cvc5 | 0 | 91 | 20685.633 | 20680.929 | 91 | 0 | 91 | 19 | 190 | 44 | 9 |
| Yices2 | 0 | 80 | 22001.89 | 22003.536 | 80 | 0 | 80 | 30 | 190 | 52 | 0 |
| Z3-Owl | 0 | 73 | 17334.78 | 17338.309 | 73 | 0 | 73 | 37 | 190 | 79 | 0 |
| Z3-Owl Fixed (n) | 0 | 23 | 904.782 | 907.187 | 23 | 0 | 23 | 87 | 190 | 76 | 0 |
| UltimateIntBlastingWrapper+SMTInterpol | 0 | 0 | 0.0 | 0.0 | 0 | 0 | 0 | 110 | 190 | 204 | 1 |

24 seconds Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout |
|---|---|---|---|---|---|---|---|---|---|---|
| 2022-Bitwuzla (n) | 0 | 192 | 759.751 | 755.304 | 192 | 130 | 62 | 108 | 107 | 1 |
| Bitwuzla | 0 | 181 | 781.914 | 780.325 | 181 | 117 | 64 | 119 | 119 | 0 |
| Bitwuzla Fixed (n) | 0 | 180 | 776.284 | 756.101 | 180 | 117 | 63 | 120 | 120 | 0 |
| Yices2 | 0 | 178 | 391.875 | 370.634 | 178 | 139 | 39 | 122 | 122 | 0 |
| cvc5 | 0 | 134 | 894.317 | 871.308 | 134 | 103 | 31 | 166 | 157 | 9 |
| Z3-Owl Fixed (n) | 0 | 55 | 154.216 | 155.587 | 55 | 36 | 19 | 245 | 180 | 0 |
| UltimateIntBlastingWrapper+SMTInterpol | 0 | 1 | 16.33 | 15.477 | 1 | 1 | 0 | 299 | 262 | 1 |
| Z3-Owl | 1 | 125 | 646.21 | 644.29 | 125 | 92 | 33 | 175 | 174 | 0 |

(n) Non-competing.
N/A: Benchmarks not known to be SAT/UNSAT, respectively.
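For readers who want to reproduce the ordering shown in the tables above, the sketch below illustrates one plausible reading of the score columns: solvers are compared by error score (lower is better), then correct score (higher is better), with the wall-time score as a tie-breaker, and non-competing entries are excluded when determining the winner. This is an assumption based on the published SMT-COMP ranking conventions, not part of the official results pipeline; the row data is copied from the Parallel Performance table.

```python
# Minimal sketch (assumed ranking criteria, not official SMT-COMP tooling).
from dataclasses import dataclass

@dataclass
class Row:
    solver: str
    error_score: int
    correct_score: int
    wall_time_score: float
    non_competing: bool = False

# Parallel Performance rows for QF_UFBV, copied from the table above.
rows = [
    Row("2022-Bitwuzla", 0, 287, 12589.737, non_competing=True),
    Row("Bitwuzla", 0, 284, 19865.311),
    Row("Bitwuzla Fixed", 0, 284, 19900.084, non_competing=True),
    Row("Yices2", 0, 248, 34908.5),
    Row("cvc5", 0, 246, 35161.613),
    Row("Z3-Owl Fixed", 0, 63, 2958.478, non_competing=True),
    Row("UltimateIntBlastingWrapper+SMTInterpol", 0, 9, 4733.648),
    Row("Z3-Owl", 4, 217, 39850.645),
]

def rank_key(r: Row):
    # Lower error score first, then higher correct score, then lower wall time.
    return (r.error_score, -r.correct_score, r.wall_time_score)

competing = [r for r in rows if not r.non_competing]
winner = min(competing, key=rank_key)
print(winner.solver)  # -> Bitwuzla, matching the Winners table
```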