SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2023

QF_UFDTLIRA (Single Query Track)

Competition results for the QF_UFDTLIRA logic in the Single Query Track.

Page generated on 2023-07-06 16:04:54 +0000

Benchmarks: 9
Time Limit: 1200 seconds
Memory Limit: 60 GB

Winners

Sequential Performance        cvc5
Parallel Performance          cvc5
SAT Performance (parallel)    cvc5
UNSAT Performance (parallel)  cvc5
24s Performance (parallel)    cvc5

Sequential Performance

Solver           Error Score  Correct Score  CPU Time Score  Wall Time Score  Solved  Solved SAT  Solved UNSAT  Unsolved  Timeout  Memout
cvc5             0            9              0.2             0.195            9       5           4             0         0        0
2022-z3-4.8.17n  0            9              0.324           0.307            9       5           4             0         0        0
SMTInterpol      0            9              5.956           3.825            9       5           4             0         0        0

Parallel Performance

Solver           Error Score  Correct Score  CPU Time Score  Wall Time Score  Solved  Solved SAT  Solved UNSAT  Unsolved  Timeout  Memout
cvc5             0            9              0.2             0.195            9       5           4             0         0        0
2022-z3-4.8.17n  0            9              0.324           0.307            9       5           4             0         0        0
SMTInterpol      0            9              5.956           3.825            9       5           4             0         0        0

SAT Performance

Solver           Error Score  Correct Score  CPU Time Score  Wall Time Score  Solved  Solved SAT  Solved UNSAT  Unsolved  N/A  Timeout  Memout
cvc5             0            5              0.117           0.115            5       5           0             0         4    0        0
2022-z3-4.8.17n  0            5              0.187           0.177            5       5           0             0         4    0        0
SMTInterpol      0            5              3.186           2.034            5       5           0             0         4    0        0

UNSAT Performance

Solver           Error Score  Correct Score  CPU Time Score  Wall Time Score  Solved  Solved SAT  Solved UNSAT  Unsolved  N/A  Timeout  Memout
cvc5             0            4              0.083           0.081            4       0           4             0         5    0        0
2022-z3-4.8.17n  0            4              0.137           0.13             4       0           4             0         5    0        0
SMTInterpol      0            4              2.77            1.792            4       0           4             0         5    0        0

24 seconds Performance

Solver           Error Score  Correct Score  CPU Time Score  Wall Time Score  Solved  Solved SAT  Solved UNSAT  Unsolved  Timeout  Memout
cvc5             0            9              0.2             0.195            9       5           4             0         0        0
2022-z3-4.8.17n  0            9              0.324           0.307            9       5           4             0         0        0
SMTInterpol     0            9              5.956           3.825            9       5           4             0         0        0

n: Non-competing.
N/A: Benchmarks not known to be SAT/UNSAT, respectively.