SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.


SMT-COMP 2016


QF_RDL (Main Track)

Competition results for the QF_RDL division, as of Thu Jul 7 07:24:34 GMT.

Benchmarks in this division: 220

Winners

Sequential Performance    Parallel Performance
Yices2                    Yices2

Result table ¹

Sequential Performance

Solver        Error Score   Correct Score   avg. CPU time (s)
CVC4          0.000         217.323           75.921
OpenSMT2      0.000         191.323          423.544
SMTInterpol   0.000         209.272          275.990
Yices2        0.000         220.000           17.735
toysmt        0.000          53.154         1523.443
veriT-dev     0.000         217.992           58.782
z3 ⁿ          0.000         217.323           66.522

Parallel Performance

Solver        Error Score   Correct Score   avg. CPU time (s)   avg. WALL time (s)   Unsolved
CVC4          0.000         217.323           75.938              75.896               4
OpenSMT2      0.000         191.323          423.719             423.335              28
SMTInterpol   0.000         209.941          293.425             261.928              14
Yices2        0.000         220.000           17.735              17.713               0
toysmt        0.000          53.154         1523.989            1523.410             172
veriT-dev     0.000         217.992           58.796              58.431               3
z3 ⁿ          0.000         217.323           66.540              66.474               4

ⁿ Non-competitive.

¹ Scores are computed according to Section 7 of the rules.
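The score columns above can be illustrated with a minimal sketch. This is a simplified scheme, not the official one: the exact per-benchmark weights are defined in Section 7 of the SMT-COMP 2016 rules and are taken here as given inputs. The `BenchmarkResult` type and `score` function are hypothetical names introduced for illustration only. Wrong definitive answers add their weight to the error score, correct answers add theirs to the correct score, and the average CPU time is taken over all benchmarks in the division (including unsolved ones).

```python
from dataclasses import dataclass

@dataclass
class BenchmarkResult:
    # Hypothetical record of one solver run on one benchmark.
    weight: float    # per-benchmark weight (Section 7 of the rules defines these)
    status: str      # solver answer: "sat", "unsat", "unknown", or "timeout"
    expected: str    # known expected answer: "sat" or "unsat"
    cpu_time: float  # CPU time in seconds

def score(results):
    """Return (error score, correct score, avg. CPU time) for one solver."""
    # A definitive answer that contradicts the expected status is an error.
    error = sum(r.weight for r in results
                if r.status in ("sat", "unsat") and r.status != r.expected)
    # A definitive answer that matches the expected status counts as correct.
    correct = sum(r.weight for r in results
                  if r.status in ("sat", "unsat") and r.status == r.expected)
    # Average CPU time over every benchmark, solved or not.
    avg_cpu = sum(r.cpu_time for r in results) / len(results)
    return error, correct, avg_cpu
```

Under this reading, a solver such as Yices2 that answers every benchmark correctly reaches a correct score equal to the total benchmark weight (220.000 here), while unsolved benchmarks depress the correct score and inflate the average CPU time.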