SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2018

QF_RDL (Main Track)

Competition results for the QF_RDL division as of Fri Jul 13, 2018, 00:02:11 GMT.

Benchmarks in this division: 255
Time limit: 1200s

Winners

Sequential Performance: Yices 2.6.0
Parallel Performance:   Yices 2.6.0

Result table [1]

Sequential Performance

Solver         Error Score   Correctly Solved Score   CPU time Score   Solved   Unsolved
CVC4                 0.000                  233.075          139.804      216         39
MathSAT (n)          0.000                  225.266          173.787      208         47
SMTInterpol          0.000                  214.953          281.974      201         54
SMTRAT-Rat           0.000                  145.451          567.205      153        102
Yices 2.6.0          0.000                  236.448           97.253      222         33
opensmt2             0.000                  208.483          279.425      192         63
veriT                0.000                  231.288          134.878      214         41
z3-4.7.1 (n)         0.000                  231.602          132.138      215         40

Parallel Performance

Solver         Error Score   Correctly Solved Score   CPU time Score   Wall time Score   Solved   Unsolved
CVC4                 0.000                  233.075          139.805           139.818      216         39
MathSAT (n)          0.000                  225.266          173.787           173.788      208         47
SMTInterpol          0.000                  214.953          311.124           265.579      201         54
SMTRAT-Rat           0.000                  145.451          567.210           567.234      153        102
Yices 2.6.0          0.000                  236.448           97.255            97.248      222         33
opensmt2             0.000                  208.483          279.426           279.458      192         63
veriT                0.000                  231.288          134.880           134.854      214         41
z3-4.7.1 (n)         0.000                  231.602          132.140           132.036      215         40

(n) Non-competing.

[1] Scores are computed according to Section 7 of the rules.
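
For readers who want to sanity-check the reported winners against the result table, the following is a minimal Python sketch. It assumes a simple lexicographic ranking (fewest errors, highest correctly solved score, lowest CPU time score); this ordering is an illustration only, and the authoritative scoring and ranking procedure is the one defined in Section 7 of the rules.

```python
# Minimal sketch (not the official implementation): ranks the sequential
# results above assuming a lexicographic order of (error score ascending,
# correctly solved score descending, CPU time score ascending).

# (solver, error score, correctly solved score, CPU time score)
sequential = [
    ("CVC4",         0.000, 233.075, 139.804),
    ("MathSAT (n)",  0.000, 225.266, 173.787),
    ("SMTInterpol",  0.000, 214.953, 281.974),
    ("SMTRAT-Rat",   0.000, 145.451, 567.205),
    ("Yices 2.6.0",  0.000, 236.448,  97.253),
    ("opensmt2",     0.000, 208.483, 279.425),
    ("veriT",        0.000, 231.288, 134.878),
    ("z3-4.7.1 (n)", 0.000, 231.602, 132.138),
]

def rank_key(row):
    _name, errors, solved_score, cpu_score = row
    # Fewer errors first, higher correctly-solved score next,
    # lower CPU time score breaks the remaining ties.
    return (errors, -solved_score, cpu_score)

for place, row in enumerate(sorted(sequential, key=rank_key), start=1):
    print(place, row[0])
# Under this ordering Yices 2.6.0 comes out first, matching the reported
# sequential winner. Non-competing entries, marked (n), would be skipped
# when declaring a winner.
```

Applying the same ordering to the parallel table, with the wall time score as the final tie-breaker, is likewise consistent with Yices 2.6.0 winning that category.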