SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2022


UFLIA (Unsat Core Track)

Competition results for the UFLIA logic in the Unsat Core Track.

Page generated on 2022-08-10 11:18:51 +0000

Benchmarks: 3913
Time Limit: 1200 seconds
Memory Limit: 60 GB

Winners

Sequential Performance: cvc5
Parallel Performance: Vampire

Sequential Performance

Solver                      Error Score  Correct Score  CPU Time Score  Wall Time Score  Timeout  Memout
2021-cvc5-ucn                         0         929204      200406.758       213583.566      162       0
z3-4.8.17n                            0         878902       308001.18       308059.242      195       3
cvc5                                  0         850989       237645.49       237600.124      180       0
Vampire                               0         846167      316213.622       254202.467      194       0
smtinterpol                           0         593233     1715140.422      1704824.465     1386       0
UltimateEliminator+MathSAT            0           1645        34040.03        25386.736        8       0
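The table above is ordered the way SMT-COMP ranks solvers: fewer errors first, then a higher correct score, with time scores as a tie-breaker. A minimal sketch of that ordering, applied to the sequential results above (this is an illustration, not the official scoring tool; the tie-breaking order shown is an assumption based on how the table reads):

```python
# Each row: (solver, error score, correct score, cpu time, wall time, timeouts, memouts)
# Data copied from the sequential-performance table above.
results = [
    ("2021-cvc5-ucn",              0, 929204,  200406.758,  213583.566,  162, 0),
    ("z3-4.8.17",                  0, 878902,  308001.18,   308059.242,  195, 3),
    ("cvc5",                       0, 850989,  237645.49,   237600.124,  180, 0),
    ("Vampire",                    0, 846167,  316213.622,  254202.467,  194, 0),
    ("smtinterpol",                0, 593233,  1715140.422, 1704824.465, 1386, 0),
    ("UltimateEliminator+MathSAT", 0, 1645,    34040.03,    25386.736,   8,   0),
]

# Rank: error score ascending, correct score descending, wall time ascending
# (assumed tie-break order for illustration).
ranked = sorted(results, key=lambda r: (r[1], -r[2], r[4]))
for name, _err, correct, *_rest in ranked:
    print(f"{name}: correct score {correct}")
```

Note that the non-competing 2021 entry (`2021-cvc5-ucn`) tops this ordering; the declared sequential winner, cvc5, is the best-ranked competing solver.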

Parallel Performance

Solver                      Error Score  Correct Score  CPU Time Score  Wall Time Score  Timeout  Memout
2021-cvc5-ucn                         0         929204      212321.828       213572.426      162       0
z3-4.8.17n                            0         878902       308042.41       308050.542      195       3
Vampire                               0         857713      409275.392       211098.691       27       0
cvc5                                  0         850989       237682.46       237592.074      180       0
smtinterpol                           0         593985     1737156.652      1700466.447     1370       0
UltimateEliminator+MathSAT            0           1645        34040.03        25386.736        8       0

n Non-competing.