SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2023


QF_UFDTLIA (Single Query Track)

Competition results for the QF_UFDTLIA logic in the Single Query Track.

Page generated on 2023-07-06 16:04:54 +0000

Benchmarks: 76
Time Limit: 1200 seconds
Memory Limit: 60 GB

Winners

| Sequential Performance | Parallel Performance | SAT Performance (parallel) | UNSAT Performance (parallel) | 24s Performance (parallel) |
| --- | --- | --- | --- | --- |
| SMTInterpol | SMTInterpol | SMTInterpol | SMTInterpol | SMTInterpol |

Sequential Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| SMTInterpol | 0 | 56 | 11378.151 | 9420.579 | 56 | 46 | 10 | 20 | 20 | 0 |
| cvc5 | 0 | 39 | 7176.303 | 7177.471 | 39 | 32 | 7 | 37 | 37 | 0 |
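The counts in these tables are internally consistent: each solver's Solved total splits into its SAT and UNSAT results, and Solved plus Unsolved equals the 76 benchmarks. A minimal sanity check of the figures above (plain Python; the numbers are copied from the Sequential Performance table):

```python
# Figures from the Sequential Performance table:
# solver -> (solved, solved_sat, solved_unsat, unsolved, timeout, memout)
rows = {
    "SMTInterpol": (56, 46, 10, 20, 20, 0),
    "cvc5":        (39, 32, 7, 37, 37, 0),
}
BENCHMARKS = 76  # total benchmarks in the QF_UFDTLIA division

for solver, (solved, sat, unsat, unsolved, timeout, memout) in rows.items():
    assert solved == sat + unsat            # Solved splits into SAT/UNSAT
    assert solved + unsolved == BENCHMARKS  # every benchmark is solved or unsolved
    assert unsolved == timeout + memout     # in this division all failures are timeouts
    print(solver, "ok")
```

The same identities hold for the SAT/UNSAT tables once the N/A column (benchmarks with unknown status for that table) is added to the left-hand side.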

Parallel Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| SMTInterpol | 0 | 56 | 11378.151 | 9420.579 | 56 | 46 | 10 | 20 | 20 | 0 |
| cvc5 | 0 | 39 | 7176.303 | 7177.471 | 39 | 32 | 7 | 37 | 37 | 0 |

SAT Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | N/A | Timeout | Memout |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| SMTInterpol | 0 | 46 | 8969.072 | 7372.245 | 46 | 46 | 0 | 1 | 29 | 20 | 0 |
| cvc5 | 0 | 32 | 6014.228 | 6015.115 | 32 | 32 | 0 | 15 | 29 | 37 | 0 |

UNSAT Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | N/A | Timeout | Memout |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| SMTInterpol | 0 | 10 | 2409.08 | 2048.334 | 10 | 0 | 10 | 1 | 65 | 20 | 0 |
| cvc5 | 0 | 7 | 1162.074 | 1162.356 | 7 | 0 | 7 | 4 | 65 | 37 | 0 |

24s Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| SMTInterpol | 0 | 21 | 532.79 | 187.887 | 21 | 17 | 4 | 55 | 55 | 0 |
| cvc5 | 0 | 16 | 105.401 | 105.426 | 16 | 13 | 3 | 60 | 60 | 0 |

N/A: benchmarks not known to be SAT (in the SAT table) or UNSAT (in the UNSAT table), respectively.