SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2022

LRA (Cloud Track)

Competition results for the LRA logic in the Cloud Track.

Page generated on 2022-08-10 14:49:53 +0000

Benchmarks: 37
Time Limit: 1200 seconds
Memory Limit: N/A

This track is experimental: solvers are ranked by performance, but no winner is declared.
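For illustration only, the sketch below shows one plausible way to order result records like those in the tables that follow. It assumes a lexicographic ranking by error score (ascending), correct score (descending), and wall-time score (ascending); the exact criteria are defined by the SMT-COMP rules, and the SolverResult type and the figures (copied from the Parallel Performance table below) are just example data, not an official tool.

```python
from dataclasses import dataclass

@dataclass
class SolverResult:
    name: str
    error_score: int        # wrong answers
    correct_score: int      # correctly solved benchmarks
    wall_time_score: float  # accumulated wall-clock time in seconds

def rank(results):
    # Assumed ordering for illustration: fewer errors first, then more
    # correct answers, then less wall-clock time. The official ranking
    # criteria are given in the competition rules.
    return sorted(results,
                  key=lambda r: (r.error_score, -r.correct_score, r.wall_time_score))

# Figures copied from the Parallel Performance table below.
parallel = [
    SolverResult("Vampire", 8, 13, 32649.466),
    SolverResult("cvc5-cloud", 20, 0, 48000.000),
]

for place, r in enumerate(rank(parallel), start=1):
    print(f"{place}. {r.name}: {r.correct_score} correct, {r.wall_time_score:.3f} s")
```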

Parallel Performance

Solver       Error Score   Correct Score   Wall Time Score   Solved   Solved SAT   Solved UNSAT   Unsolved   Timeout   Memout
Vampire      8             13              32649.466         13       0            13             27         0         0
cvc5-cloud   20            0               48000.000         0        0            0              40         0         0

SAT Performance

Solver       Error Score   Correct Score   Wall Time Score   Solved   Solved SAT   Solved UNSAT   Unsolved   N/A   Timeout   Memout
cvc5-cloud   1             0               1200.000          0        0            0              1          36    0         0
Vampire      1             0               1200.000          0        0            0              1          36    0         0

UNSAT Performance

Solver       Error Score   Correct Score   Wall Time Score   Solved   Solved SAT   Solved UNSAT   Unsolved   N/A   Timeout   Memout
Vampire      7             13              9849.466          13       0            13             8          16    0         0
cvc5-cloud   19            0               25200.000         0        0            0              21         16    0         0

24 seconds Performance

Solver       Error Score   Correct Score   Wall Time Score   Solved   Solved SAT   Solved UNSAT   Unsolved   Timeout   Memout
Vampire      0             12              884.864           12       0            12             28         28        0
cvc5-cloud   0             0               960.000           0        0            0              40         40        0

n: Non-competing.
N/A: Benchmarks not known to be SAT/UNSAT, respectively.