SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.


SMT-COMP 2019


QF_DT (Single Query Track)

Competition results for the QF_DT division in the Single Query Track.

Page generated on 2019-07-23 17:56:09 +0000

Benchmarks: 4
Time Limit: 2400 seconds
Memory Limit: 60 GB

Winners

Sequential Performance: CVC4
Parallel Performance: CVC4
SAT Performance (parallel): CVC4
UNSAT Performance (parallel): CVC4
24s Performance (parallel): CVC4

Sequential Performance

| Solver     | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Abstained | Timeout | Memout |
|------------|-------------|---------------|----------------|-----------------|--------|------------|--------------|----------|-----------|---------|--------|
| 2018-CVC4ⁿ | 0           | 4             | 26.405         | 26.404          | 4      | 3          | 1            | 0        | 0         | 0       | 0      |
| CVC4       | 0           | 4             | 26.555         | 26.56           | 4      | 3          | 1            | 0        | 0         | 0       | 0      |
| Alt-Ergo   | 0           | 1             | 926.995        | 234.055         | 1      | 0          | 1            | 3        | 0         | 0       | 0      |
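As a rough illustration of how the rows above can be ordered, here is a minimal Python sketch. The sort key (fewest errors, then most correct answers, then lowest wall-time score) is an assumption for illustration, not the official SMT-COMP scoring procedure; the row data is taken from the Sequential Performance table.

```python
# Rank solvers from the Sequential Performance table above.
# NOTE: this tie-breaking order is an assumption, not SMT-COMP's official rules.
rows = [
    # (solver, error_score, correct_score, cpu_time_score, wall_time_score)
    ("2018-CVC4", 0, 4, 26.405, 26.404),
    ("CVC4",      0, 4, 26.555, 26.56),
    ("Alt-Ergo",  0, 1, 926.995, 234.055),
]

# Sort by errors ascending, correct answers descending, wall time ascending.
ranked = sorted(rows, key=lambda r: (r[1], -r[2], r[4]))

for solver, *_ in ranked:
    print(solver)
```

Note that 2018-CVC4 is a non-competing reference entry (last year's winner), so even when it ranks first it is excluded from the winner determination, leaving CVC4 as the division winner.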

Parallel Performance

| Solver     | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Abstained | Timeout | Memout |
|------------|-------------|---------------|----------------|-----------------|--------|------------|--------------|----------|-----------|---------|--------|
| 2018-CVC4ⁿ | 0           | 4             | 26.405         | 26.404          | 4      | 3          | 1            | 0        | 0         | 0       | 0      |
| CVC4       | 0           | 4             | 26.555         | 26.56           | 4      | 3          | 1            | 0        | 0         | 0       | 0      |
| Alt-Ergo   | 0           | 1             | 926.995        | 234.055         | 1      | 0          | 1            | 3        | 0         | 0       | 0      |

SAT Performance

| Solver     | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout |
|------------|-------------|---------------|----------------|-----------------|--------|------------|--------------|----------|---------|--------|
| 2018-CVC4ⁿ | 0           | 3             | 0.047          | 0.047           | 3      | 3          | 0            | 1        | 0       | 0      |
| CVC4       | 0           | 3             | 0.054          | 0.053           | 3      | 3          | 0            | 1        | 0       | 0      |
| Alt-Ergo   | 0           | 0             | 0.255          | 0.118           | 0      | 0          | 0            | 4        | 0       | 0      |

UNSAT Performance

| Solver     | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout |
|------------|-------------|---------------|----------------|-----------------|--------|------------|--------------|----------|---------|--------|
| 2018-CVC4ⁿ | 0           | 1             | 26.358         | 26.357          | 1      | 0          | 1            | 3        | 0       | 0      |
| CVC4       | 0           | 1             | 26.501         | 26.507          | 1      | 0          | 1            | 3        | 0       | 0      |
| Alt-Ergo   | 0           | 1             | 926.742        | 233.937         | 1      | 0          | 1            | 3        | 0       | 0      |

24 seconds Performance

| Solver     | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Abstained | Timeout | Memout |
|------------|-------------|---------------|----------------|-----------------|--------|------------|--------------|----------|-----------|---------|--------|
| 2018-CVC4ⁿ | 0           | 3             | 24.047         | 24.047          | 3      | 3          | 0            | 1        | 0         | 1       | 0      |
| CVC4       | 0           | 3             | 24.054         | 24.053          | 3      | 3          | 0            | 1        | 0         | 1       | 0      |
| Alt-Ergo   | 0           | 0             | 24.255         | 24.118          | 0      | 0          | 0            | 4        | 0         | 1       | 0      |

ⁿ Non-competing.
Abstained: total number of benchmarks, across the logics in this division, that the solver chose to abstain from. For the SAT/UNSAT scores, this count also includes benchmarks whose status is not known to be SAT/UNSAT.