SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2023

UFLIA (Cloud Track)

Competition results for the UFLIA logic in the Cloud Track.

Page generated on 2023-07-06 16:06:19 +0000

Benchmarks: 32
Time Limit: 1200 seconds
Memory Limit: N/A

This track is experimental. Solvers are ranked by performance only; no winner is declared.
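
The tables below list, per solver, an error score, a correct score, a cumulative wall-time score, and counts of solved and unsolved benchmarks. As a rough illustration only, assuming the usual SMT-COMP convention of ordering solvers by fewest errors, then most correct answers, then lowest wall time, such rows could be ordered as in the Python sketch below; the Row fields are named after the table columns for this example and are not part of any official SMT-COMP tooling.

    from dataclasses import dataclass

    @dataclass
    class Row:
        # Field names mirror the table columns; they are invented for this sketch.
        solver: str
        error_score: int
        correct_score: int
        wall_time_score: float

    def rank(rows):
        """Order rows: fewest errors, then most correct answers, then least wall time."""
        return sorted(rows, key=lambda r: (r.error_score, -r.correct_score, r.wall_time_score))

    # Values taken from the Parallel Performance table below.
    rows = [Row("Vampire", 0, 9, 1970.577), Row("cvc5", 0, 8, 1968.582)]
    for position, row in enumerate(rank(rows), start=1):
        print(position, row.solver)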

Parallel Performance

Solver   Error Score  Correct Score  Wall Time Score  Solved  Solved SAT  Solved UNSAT  Unsolved  Timeout  Memout
Vampire  0            9              1970.577         9       0           9             23        0        0
cvc5     0            8              1968.582         8       1           7             24        0        0

SAT Performance

Solver   Error Score  Correct Score  Wall Time Score  Solved  Solved SAT  Solved UNSAT  Unsolved  N/A  Timeout  Memout
cvc5     0            1              4.256            1       1           0             0         31   0        0
Vampire  0            0              0.0              0       0           0             1         31   0        0

UNSAT Performance

Solver   Error Score  Correct Score  Wall Time Score  Solved  Solved SAT  Solved UNSAT  Unsolved  N/A  Timeout  Memout
Vampire  0            9              1970.577         9       0           9             5         18   0        0
cvc5     0            7              1964.326         7       0           7             7         18   0        0

24 seconds Performance

Solver   Error Score  Correct Score  Wall Time Score  Solved  Solved SAT  Solved UNSAT  Unsolved  Timeout  Memout
cvc5     0            2              15.239           2       1           1             30        30       0
Vampire  0            0              0.0              0       0           0             32        32       0

n: Non-competing.
N/A: Benchmarks not known to be SAT/UNSAT, respectively.