SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2019

QF_SLIA (Single Query Track)

Competition results for the QF_SLIA division in the Single Query Track.

Page generated on 2019-07-23 17:57:14 +0000

Benchmarks: 23175
Time Limit: 2400 seconds
Memory Limit: 60 GB

This division is experimental: solvers are ranked by performance only, and no winner is declared.

Sequential Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Abstained | Timeout | Memout |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| CVC4 | 0 | 23102 | 431446.836 | 433519.411 | 23102 | 18624 | 4478 | 73 | 0 | 73 | 0 |
| 2018-CVC4n | 0 | 22641 | 1484330.321 | 1485372.228 | 22641 | 18181 | 4460 | 534 | 0 | 533 | 0 |

Parallel Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Abstained | Timeout | Memout |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| CVC4 | 0 | 23102 | 431823.756 | 433516.171 | 23102 | 18624 | 4478 | 73 | 0 | 73 | 0 |
| 2018-CVC4n | 0 | 22641 | 1485225.351 | 1485348.298 | 22641 | 18181 | 4460 | 534 | 0 | 533 | 0 |

SAT Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| CVC4 | 0 | 18624 | 236998.87 | 238376.285 | 18624 | 18624 | 0 | 4551 | 73 | 0 |
| 2018-CVC4n | 0 | 18181 | 1264224.602 | 1264345.076 | 18181 | 18181 | 0 | 4994 | 533 | 0 |

UNSAT Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| CVC4 | 0 | 4478 | 34024.887 | 34339.886 | 4478 | 0 | 4478 | 18697 | 73 | 0 |
| 2018-CVC4n | 0 | 4460 | 60200.749 | 60203.223 | 4460 | 0 | 4460 | 18715 | 533 | 0 |

24 seconds Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Abstained | Timeout | Memout |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| CVC4 | 0 | 22463 | 25644.82 | 25572.808 | 22463 | 18045 | 4418 | 712 | 0 | 712 | 0 |
| 2018-CVC4n | 0 | 22145 | 37021.923 | 36927.919 | 22145 | 17705 | 4440 | 1030 | 0 | 1030 | 0 |

n: Non-competing.
Abstained: total number of benchmarks in this division's logics that the solver chose to abstain from. For the SAT/UNSAT scores, this count also includes benchmarks not known to be SAT/UNSAT.
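As a sanity check on how the columns relate, each row should satisfy Solved = Solved SAT + Solved UNSAT and Solved + Unsolved + Abstained = Benchmarks, with timeouts and memouts accounting for at most the unsolved benchmarks. The minimal sketch below checks this for the CVC4 row of the Sequential Performance table; the dictionary field names are illustrative only and are not part of the competition tooling.

```python
# Sanity-check the column relationships for one row of the Sequential
# Performance table (CVC4). Field names here are illustrative, not official.
row = {
    "benchmarks": 23175,   # total benchmarks in the division
    "solved": 23102,
    "solved_sat": 18624,
    "solved_unsat": 4478,
    "unsolved": 73,
    "abstained": 0,
    "timeout": 73,
    "memout": 0,
}

# Solved results split into SAT and UNSAT answers.
assert row["solved"] == row["solved_sat"] + row["solved_unsat"]

# Every benchmark is either solved, unsolved, or abstained from.
assert row["benchmarks"] == row["solved"] + row["unsolved"] + row["abstained"]

# Timeouts and memouts account for at most the unsolved benchmarks;
# any remainder would be "unknown" answers returned within the limits.
assert row["timeout"] + row["memout"] <= row["unsolved"]

print("row is internally consistent")
```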