SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.


SMT-COMP 2022


QF_UF (Model Validation Track)

Competition results for the QF_UF logic in the Model Validation Track.

Page generated on 2022-08-10 11:19:11 +0000

Benchmarks: 1571
Time Limit: 1200 seconds
Memory Limit: 60 GB

Winners

Sequential Performance: Yices2
Parallel Performance: Yices2

Sequential Performance

Solver                         Error Score  Correct Score  CPU Time Score  Wall Time Score  Timeout  Memout
Yices2                         0            1571           69.015          74.425           0        0
2021-Yices2 model-validationⁿ  0            1571           70.273          72.563           0        0
z3-4.8.17ⁿ                     0            1571           197.3           178.391          0        0
cvc5                           0            1571           797.319         801.265          0        0
smtinterpol                    0            1571           4560.974        2000.302         0        0
OpenSMT                        0            1551           297.822         305.319          0        0
MathSATⁿ                       0            636            230.228         226.871          0        0

Parallel Performance

Solver                         Error Score  Correct Score  CPU Time Score  Wall Time Score  Timeout  Memout
2021-Yices2 model-validationⁿ  0            1571           70.273          72.563           0        0
Yices2                         0            1571           69.015          74.425           0        0
z3-4.8.17ⁿ                     0            1571           197.3           178.391          0        0
cvc5                           0            1571           797.319         801.265          0        0
smtinterpol                    0            1571           4560.974        2000.302         0        0
OpenSMT                        0            1551           297.822         305.319          0        0
MathSATⁿ                       0            636            230.228         226.871          0        0

ⁿ Non-competing.
N/A: Benchmarks not known to be SAT/UNSAT, respectively.
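The row order of the two tables is consistent with a simple ranking rule: fewer errors first, more correct answers first, then the lower time score, with the sequential table ordered by CPU time and the parallel table by wall time. The sketch below reproduces that ordering from the published numbers; the `rank` helper is illustrative only, not the official SMT-COMP scoring script.

```python
# Rows transcribed from the tables above:
# (solver, error score, correct score, cpu time score, wall time score,
#  timeouts, memouts). Footnote markers are dropped from the names.
rows = [
    ("Yices2",                       0, 1571,   69.015,   74.425, 0, 0),
    ("2021-Yices2 model-validation", 0, 1571,   70.273,   72.563, 0, 0),
    ("z3-4.8.17",                    0, 1571,  197.3,    178.391, 0, 0),
    ("cvc5",                         0, 1571,  797.319,  801.265, 0, 0),
    ("smtinterpol",                  0, 1571, 4560.974, 2000.302, 0, 0),
    ("OpenSMT",                      0, 1551,  297.822,  305.319, 0, 0),
    ("MathSAT",                      0,  636,  230.228,  226.871, 0, 0),
]

def rank(rows, time_index):
    """Assumed ranking: fewest errors, then most correct answers,
    then lowest time score (time_index 3 = CPU, 4 = wall)."""
    return sorted(rows, key=lambda r: (r[1], -r[2], r[time_index]))

sequential = rank(rows, 3)  # matches the Sequential Performance table order
parallel = rank(rows, 4)    # matches the Parallel Performance table order

print(sequential[0][0])  # Yices2
print(parallel[0][0])    # 2021-Yices2 model-validation
```

Under this rule the two winners differ only in which time score breaks the tie among the five solvers that solved all 1571 benchmarks with zero errors.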