SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2022

QF_ALIA (Unsat Core Track)

Competition results for the QF_ALIA logic in the Unsat Core Track.

Page generated on 2022-08-10 11:18:51 +0000

Benchmarks: 30
Time Limit: 1200 seconds
Memory Limit: 60 GB

Winners

Sequential Performance  Parallel Performance
smtinterpol             smtinterpol

(smtinterpol takes both awards; z3-4.8.17 posted the highest correct score but entered as non-competing.)

Sequential Performance

Solver             Error Score  Correct Score  CPU Time Score  Wall Time Score  Timeout  Memout
z3-4.8.17 (n)      0            654            0.936           0.888            0        0
smtinterpol        0            633            32.659          17.041           0        0
Yices2             0            594            0.156           0.379            0        0
MathSAT (n)        0            553            0.769           0.777            0        0
2021-MathSAT5 (n)  0            553            0.792           0.796            0        0
cvc5               0            0              2.451           2.437            0        0

Parallel Performance

Solver             Error Score  Correct Score  CPU Time Score  Wall Time Score  Timeout  Memout
z3-4.8.17 (n)      0            654            0.936           0.888            0        0
smtinterpol        0            633            32.659          17.041           0        0
Yices2             0            594            0.156           0.379            0        0
MathSAT (n)        0            553            0.769           0.777            0        0
2021-MathSAT5 (n)  0            553            0.792           0.796            0        0
cvc5               0            0              2.451           2.437            0        0

(n) Non-competing.