SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2020

QF_DT (Single Query Track)

Competition results for the QF_DT division in the Single Query Track.

Page generated on 2020-07-04 11:46:59 +0000

Benchmarks: 4
Time Limit: 1200 seconds
Memory Limit: 60 GB

Winners

| Sequential Performance | Parallel Performance | SAT Performance (parallel) | UNSAT Performance (parallel) | 24s Performance (parallel) |
|---|---|---|---|---|
| CVC4 | CVC4 | CVC4 | CVC4 | CVC4 |

Sequential Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout |
|---|---|---|---|---|---|---|---|---|---|---|
| z3ⁿ | 0 | 4 | 2.982 | 2.983 | 4 | 3 | 1 | 0 | 0 | 0 |
| 2018-CVC4ⁿ | 0 | 4 | 25.987 | 25.992 | 4 | 3 | 1 | 0 | 0 | 0 |
| CVC4 | 0 | 4 | 60.262 | 60.262 | 4 | 3 | 1 | 0 | 0 | 0 |
| Alt-Ergo | 0 | 0 | 1200.401 | 1200.148 | 0 | 0 | 0 | 4 | 1 | 0 |
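
Within each table, solvers are listed by error score (ascending), then correct score (descending), then time score (ascending); entries marked ⁿ are shown for reference but do not take part in determining the winner, which is why CVC4 wins the division despite z3 and 2018-CVC4 posting lower time scores. The snippet below is a minimal sketch of that ordering, not the competition's actual scoring code; the record fields and the `non_competing` flag are illustrative, and the numbers are copied from the Sequential Performance table above.

```python
# Minimal sketch of the ordering visible in these tables:
# rank by error score (lower is better), then correct score
# (higher is better), then time score (lower is better).
# Field names and the non_competing flag are illustrative only.
results = [
    {"solver": "z3",        "errors": 0, "correct": 4, "wall": 2.983,    "non_competing": True},
    {"solver": "2018-CVC4", "errors": 0, "correct": 4, "wall": 25.992,   "non_competing": True},
    {"solver": "CVC4",      "errors": 0, "correct": 4, "wall": 60.262,   "non_competing": False},
    {"solver": "Alt-Ergo",  "errors": 0, "correct": 0, "wall": 1200.148, "non_competing": False},
]

ranked = sorted(results, key=lambda r: (r["errors"], -r["correct"], r["wall"]))

# The winner is the best-ranked competing solver (here: CVC4).
winner = next(r for r in ranked if not r["non_competing"])
print(winner["solver"])
```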

Parallel Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout |
|---|---|---|---|---|---|---|---|---|---|---|
| z3ⁿ | 0 | 4 | 2.982 | 2.983 | 4 | 3 | 1 | 0 | 0 | 0 |
| 2018-CVC4ⁿ | 0 | 4 | 25.987 | 25.992 | 4 | 3 | 1 | 0 | 0 | 0 |
| CVC4 | 0 | 4 | 60.262 | 60.262 | 4 | 3 | 1 | 0 | 0 | 0 |
| Alt-Ergo | 0 | 0 | 1200.401 | 1200.148 | 0 | 0 | 0 | 4 | 1 | 0 |

SAT Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout |
|---|---|---|---|---|---|---|---|---|---|---|
| 2018-CVC4ⁿ | 0 | 3 | 0.045 | 0.044 | 3 | 3 | 0 | 1 | 0 | 0 |
| CVC4 | 0 | 3 | 0.05 | 0.049 | 3 | 3 | 0 | 1 | 0 | 0 |
| z3ⁿ | 0 | 3 | 0.136 | 0.137 | 3 | 3 | 0 | 1 | 0 | 0 |
| Alt-Ergo | 0 | 0 | 0.401 | 0.148 | 0 | 0 | 0 | 4 | 1 | 0 |

UNSAT Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout |
|---|---|---|---|---|---|---|---|---|---|---|
| z3ⁿ | 0 | 1 | 2.846 | 2.846 | 1 | 0 | 1 | 3 | 0 | 0 |
| 2018-CVC4ⁿ | 0 | 1 | 25.942 | 25.948 | 1 | 0 | 1 | 3 | 0 | 0 |
| CVC4 | 0 | 1 | 60.212 | 60.213 | 1 | 0 | 1 | 3 | 0 | 0 |
| Alt-Ergo | 0 | 0 | 1200.0 | 1200.0 | 0 | 0 | 0 | 4 | 1 | 0 |

24 seconds Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout |
|---|---|---|---|---|---|---|---|---|---|---|
| z3ⁿ | 0 | 4 | 2.982 | 2.983 | 4 | 3 | 1 | 0 | 0 | 0 |
| 2018-CVC4ⁿ | 0 | 3 | 24.045 | 24.044 | 3 | 3 | 0 | 1 | 1 | 0 |
| CVC4 | 0 | 3 | 24.05 | 24.049 | 3 | 3 | 0 | 1 | 1 | 0 |
| Alt-Ergo | 0 | 0 | 24.401 | 24.148 | 0 | 0 | 0 | 4 | 1 | 0 |

ⁿ Non-competing.