SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2023

QF_UFLIA (Model Validation Track)

Competition results for the QF_UFLIA logic (quantifier-free linear integer arithmetic with uninterpreted functions) in the Model Validation Track.

Page generated on 2023-07-06 16:06:00 +0000

Benchmarks: 330
Time Limit: 1200 seconds
Memory Limit: 60 GB
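
In this track, benchmarks are satisfiable and, roughly, a solver scores a benchmark only if it answers sat and emits a model that an external validator accepts. As a sketch of the kind of input involved, here is a minimal illustrative QF_UFLIA query (not an actual competition benchmark):

    ; Illustrative QF_UFLIA query; not from the SMT-LIB benchmark set.
    (set-logic QF_UFLIA)
    (set-option :produce-models true)
    (declare-fun f (Int) Int)    ; uninterpreted function over the integers
    (declare-const x Int)
    (assert (= (f x) (+ x 1)))   ; ties f to linear integer arithmetic
    (assert (>= x 0))
    (check-sat)                  ; expected answer: sat
    (get-model)                  ; the printed model is what gets validated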

Winners

Sequential Performance: SMTInterpol
Parallel Performance:   SMTInterpol

Sequential Performance

Solver              Error Score  Correct Score  CPU Time Score  Wall Time Score  Timeout  Memout
SMTInterpol                   0            330        7136.322          5474.38        0       0
2022-smtinterpol n            0            329        6103.116          4475.36        1       0
Yices2                        0            320        2884.517         2886.388       10       0
cvc5                          0            319        2125.748         2126.018       11       0
OpenSMT                       0            316        2808.713          2802.96       14       0

Parallel Performance

Solver              Error Score  Correct Score  CPU Time Score  Wall Time Score  Timeout  Memout
SMTInterpol                   0            330        7136.322          5474.38        0       0
2022-smtinterpol n            0            330        7319.056          5618.28        0       0
Yices2                        0            320        2884.517         2886.388       10       0
cvc5                          0            319        2125.748         2126.018       11       0
OpenSMT                       0            316        2808.713          2802.96       14       0
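
As a consistency check on both tables, Correct Score, Timeout, and Memout sum to the 330 benchmarks for every solver listed (e.g., cvc5: 319 + 11 + 0 = 330).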

n Non-competing.