SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2022

QF_LIA (Parallel Track)

Competition results for the QF_LIA logic in the Parallel Track.

Page generated on 2022-08-10 14:49:55 +0000

Benchmarks: 26
Time Limit: 1200 seconds
Memory Limit: N/A

This track is experimental: solvers are ranked by performance, but no winner is declared.

Parallel Performance

Solver                         Error Score  Correct Score  Wall Time Score  Solved  Solved SAT  Solved UNSAT  Unsolved  Timeout  Memout
SMTS cube-and-conquer (fixed)  11           14             11059.76         14      13          1             12        0        0
SMTS cube-and-conquer          12           14             13501.866        14      14          0             12        0        0
SMTS portfolio                 12           13             14524.144        13      13          0             13        0        0

SAT Performance

Solver                         Error Score  Correct Score  Wall Time Score  Solved  Solved SAT  Solved UNSAT  Unsolved  N/A  Timeout  Memout
SMTS cube-and-conquer (fixed)  3            13             7099.819         13      13          0             7         6    0        0
SMTS cube-and-conquer          6            14             8540.053         14      14          0             6         6    0        0
SMTS portfolio                 6            13             9559.219         13      13          0             7         6    0        0

UNSAT Performance

Solver                         Error Score  Correct Score  Wall Time Score  Solved  Solved SAT  Solved UNSAT  Unsolved  N/A  Timeout  Memout
SMTS cube-and-conquer          6            0              4961.813         0       0           0             6         20   0        0
SMTS portfolio                 6            0              4964.925         0       0           0             6         20   0        0
SMTS cube-and-conquer (fixed)  8            1              3959.941         1       0           1             5         20   0        0
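
As a cross-check on the three tables above: for each solver, the Parallel Performance error score, correct score, and wall-time score are the sums of the corresponding SAT Performance and UNSAT Performance entries (for example, 7099.819 + 3959.941 = 11059.76 for SMTS cube-and-conquer (fixed)). The short Python sketch below, a reading aid using only numbers copied from the tables, verifies this decomposition.

    # Sanity check: Parallel Performance scores = SAT Performance + UNSAT Performance.
    # Each tuple is (error score, correct score, wall-time score), copied verbatim
    # from the tables above.
    sat = {
        "SMTS cube-and-conquer (fixed)": (3, 13, 7099.819),
        "SMTS cube-and-conquer":         (6, 14, 8540.053),
        "SMTS portfolio":                (6, 13, 9559.219),
    }
    unsat = {
        "SMTS cube-and-conquer (fixed)": (8, 1, 3959.941),
        "SMTS cube-and-conquer":         (6, 0, 4961.813),
        "SMTS portfolio":                (6, 0, 4964.925),
    }
    parallel = {
        "SMTS cube-and-conquer (fixed)": (11, 14, 11059.76),
        "SMTS cube-and-conquer":         (12, 14, 13501.866),
        "SMTS portfolio":                (12, 13, 14524.144),
    }

    for solver, (err, correct, wall) in parallel.items():
        e_sat, c_sat, w_sat = sat[solver]
        e_uns, c_uns, w_uns = unsat[solver]
        assert err == e_sat + e_uns and correct == c_sat + c_uns
        # Wall-time scores are printed rounded, so allow a small tolerance.
        assert abs(wall - (w_sat + w_uns)) < 0.005
    print("OK: parallel scores decompose into SAT + UNSAT scores for all solvers")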

24 seconds Performance

Solver                         Error Score  Correct Score  Wall Time Score  Solved  Solved SAT  Solved UNSAT  Unsolved  Timeout  Memout
SMTS portfolio                 0            9              514.415          9       9           0             17        17       0
SMTS cube-and-conquer          0            6              543.44           6       6           0             20        20       0
SMTS cube-and-conquer (fixed)  0            5              547.405          5       5           0             21        21       0

n: Non-competing.
N/A: Benchmarks not known to be SAT/UNSAT, respectively.