SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2022

UFLIA (Cloud Track)

Competition results for the UFLIA logic in the Cloud Track.

Page generated on 2022-08-10 14:49:53 +0000

Benchmarks: 19
Time Limit: 1200 seconds
Memory Limit: N/A

This track is experimental. Solvers are ranked by performance only; no winner is declared.

Parallel Performance

Solver      Error Score  Correct Score  Wall Time Score  Solved  Solved SAT  Solved UNSAT  Unsolved  Timeout  Memout
Vampire     2            6              16333.651        6       0           6             13        0        0
cvc5-cloud  3            0              22800.0          0       0           0             19        0        0

SAT Performance

Solver      Error Score  Correct Score  Wall Time Score  Solved  Solved SAT  Solved UNSAT  Unsolved  N/A  Timeout  Memout
cvc5-cloud  0            0              0.0              0       0           0             0         19   0        0
Vampire     0            0              0.0              0       0           0             0         19   0        0

UNSAT Performance

Solver      Error Score  Correct Score  Wall Time Score  Solved  Solved SAT  Solved UNSAT  Unsolved  N/A  Timeout  Memout
Vampire     2            6              3133.651         6       0           6             2         11   0        0
cvc5-cloud  3            0              9600.0           0       0           0             8         11   0        0

24 seconds Performance

Solver      Error Score  Correct Score  Wall Time Score  Solved  Solved SAT  Solved UNSAT  Unsolved  Timeout  Memout
Vampire     0            2              442.311          2       0           2             17        17       0
cvc5-cloud  0            0              456.0            0       0           0             19        19       0

n: Non-competing.
N/A: Benchmarks not known to be SAT/UNSAT, respectively.
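The scoring columns in these tables are defined by the competition rules. As a rough illustration only, the sketch below shows one way a row like the ones above could be recomputed from per-benchmark results; the `Result` record layout and the exact error/timeout accounting are assumptions made for this example, not the official SMT-COMP scoring scripts (see the Rules page for the authoritative definitions).

```python
# Minimal sketch: recompute one table row for a single solver from its
# per-benchmark results. Record layout and scoring details are illustrative
# assumptions, not the official SMT-COMP scoring code.
from dataclasses import dataclass

TIME_LIMIT = 1200.0  # seconds, per the track configuration above


@dataclass
class Result:
    answer: str       # solver answer: "sat", "unsat", or "unknown"
    expected: str     # benchmark status: "sat", "unsat", or "unknown"
    wall_time: float  # wall-clock time used, in seconds


def aggregate(results):
    """Derive the aggregate columns shown in the tables above."""
    error = correct = solved_sat = solved_unsat = timeout = 0
    wall_time_score = 0.0
    for r in results:
        wall_time_score += min(r.wall_time, TIME_LIMIT)
        if r.answer in ("sat", "unsat"):
            if r.expected not in ("unknown", r.answer):
                error += 1            # wrong answer on a benchmark with known status
            else:
                correct += 1
                if r.answer == "sat":
                    solved_sat += 1
                else:
                    solved_unsat += 1
        elif r.wall_time >= TIME_LIMIT:
            timeout += 1              # no answer within the time limit
    solved = solved_sat + solved_unsat
    return {
        "Error Score": error,
        "Correct Score": correct,
        "Wall Time Score": round(wall_time_score, 3),
        "Solved": solved,
        "Solved SAT": solved_sat,
        "Solved UNSAT": solved_unsat,
        "Unsolved": len(results) - solved,
        "Timeout": timeout,
    }
```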