SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.


SMT-COMP 2022


QF_UFLIA (Unsat Core Track)

Competition results for the QF_UFLIA logic in the Unsat Core Track.

Page generated on 2022-08-10 11:18:51 +0000

Benchmarks: 16
Time Limit: 1200 seconds
Memory Limit: 60 GB

Winners

Sequential Performance    Parallel Performance
Yices2                    Yices2

Sequential Performance

Solver          Error Score  Correct Score  CPU Time Score  Wall Time Score  Timeout  Memout
Yices2          0            21             0.069           0.204            0        0
MathSATn        0            21             0.277           0.277            0        0
2021-MathSAT5n  0            21             0.312           0.328            0        0
cvc5            0            21             0.318           0.310            0        0
z3-4.8.17n      0            21             0.423           0.402            0        0
smtinterpol     0            21             9.681           6.609            0        0

Parallel Performance

Solver          Error Score  Correct Score  CPU Time Score  Wall Time Score  Timeout  Memout
Yices2          0            21             0.069           0.204            0        0
MathSATn        0            21             0.277           0.277            0        0
cvc5            0            21             0.318           0.310            0        0
2021-MathSAT5n  0            21             0.312           0.328            0        0
z3-4.8.17n      0            21             0.423           0.402            0        0
smtinterpol     0            21             9.681           6.609            0        0

n: Non-competing.
N/A: Benchmarks not known to be SAT/UNSAT, respectively.