SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.


SMT-COMP 2019


QF_SLIA (Single Query Track)

Competition results for the QF_SLIA division in the Single Query Track.

Page generated on 2019-07-23 17:57:14 +0000

Benchmarks: 23175
Time Limit: 2400 seconds
Memory Limit: 60 GB

This division is experimental: solvers are ranked by performance only, and no winner is declared.

Sequential Performance

Solver      Error Score  Correct Score  CPU Time Score  Wall Time Score  Solved  Solved SAT  Solved UNSAT  Unsolved  Timeout  Memout
CVC4        0            23102          431446.836      433519.411       23102   18624       4478          73        73       0
2018-CVC4n  0            22641          1484330.321     1485372.228      22641   18181       4460          534       533      0

Parallel Performance

Solver      Error Score  Correct Score  CPU Time Score  Wall Time Score  Solved  Solved SAT  Solved UNSAT  Unsolved  Timeout  Memout
CVC4        0            23102          431823.756      433516.171       23102   18624       4478          73        73       0
2018-CVC4n  0            22641          1485225.351     1485348.298      22641   18181       4460          534       533      0

SAT Performance

Solver      Error Score  Correct Score  CPU Time Score  Wall Time Score  Solved  Solved SAT  Solved UNSAT  Unsolved  Timeout  Memout
CVC4        0            18624          236998.87       238376.285       18624   18624       0             4551      73       0
2018-CVC4n  0            18181          1264224.602     1264345.076      18181   18181       0             4994      533      0

UNSAT Performance

Solver      Error Score  Correct Score  CPU Time Score  Wall Time Score  Solved  Solved SAT  Solved UNSAT  Unsolved  Timeout  Memout
CVC4        0            4478           34024.887       34339.886        4478    0           4478          18697     73       0
2018-CVC4n  0            4460           60200.749       60203.223        4460    0           4460          18715     533      0

24-Second Performance

Scores computed as if the time limit were 24 seconds: benchmarks not solved within 24 seconds count as unsolved.

Solver      Error Score  Correct Score  CPU Time Score  Wall Time Score  Solved  Solved SAT  Solved UNSAT  Unsolved  Timeout  Memout
CVC4        0            22463          25644.82        25572.808        22463   18045       4418          712       712      0
2018-CVC4n  0            22145          37021.923       36927.919        22145   17705       4440          1030      1030     0

n: Non-competing.