SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2022

AUFLIA (Cloud Track)

Competition results for the AUFLIA logic in the Cloud Track.

Page generated on 2022-08-10 14:49:53 +0000

Benchmarks: 10
Time Limit: 1200 seconds
Memory Limit: N/A

This track is experimental: solvers are ranked by performance, but no winner is selected.

Parallel Performance

Solver     | Error Score | Correct Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout
Vampire    | 0           | 2             | 8078.936        | 2      | 0          | 2            | 8        | 0       | 0
cvc5-cloud | 0           | 0             | 12000.000       | 0      | 0          | 0            | 10       | 0       | 0
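As a side note, rows like these can be ordered mechanically. The short Python sketch below assumes an SMT-COMP-style lexicographic ranking (fewer errors first, then more correct results, then lower wall-time score); the field names and the exact ranking rule are illustrative assumptions, not taken from the competition scripts.

```python
# Hypothetical sketch: rank the Parallel Performance rows above, assuming a
# lexicographic order of (error score asc, correct score desc, wall time asc).
from dataclasses import dataclass

@dataclass
class Row:
    solver: str
    error_score: int
    correct_score: int
    wall_time_score: float

rows = [
    Row("Vampire", 0, 2, 8078.936),
    Row("cvc5-cloud", 0, 0, 12000.000),
]

ranked = sorted(rows, key=lambda r: (r.error_score, -r.correct_score, r.wall_time_score))

for place, r in enumerate(ranked, start=1):
    # Prints Vampire first, then cvc5-cloud, under the assumed ordering.
    print(place, r.solver, r.correct_score, r.wall_time_score)
```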

SAT Performance

Solver     | Error Score | Correct Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | N/A | Timeout | Memout
cvc5-cloud | 0           | 0             | 0.000           | 0      | 0          | 0            | 0        | 10  | 0       | 0
Vampire    | 0           | 0             | 0.000           | 0      | 0          | 0            | 0        | 10  | 0       | 0

UNSAT Performance

Solver     | Error Score | Correct Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | N/A | Timeout | Memout
Vampire    | 0           | 2             | 126.899         | 2      | 0          | 2            | 0        | 8   | 0       | 0
cvc5-cloud | 0           | 0             | 2400.000        | 0      | 0          | 0            | 2        | 8   | 0       | 0

24-Second Performance

Solver     | Error Score | Correct Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout
Vampire    | 0           | 1             | 232.777         | 1      | 0          | 1            | 9        | 9       | 0
cvc5-cloud | 0           | 0             | 240.000         | 0      | 0          | 0            | 10       | 10      | 0

n: Non-competing.
N/A: Benchmarks not known to be SAT/UNSAT, respectively.