SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.


SMT-COMP 2016


QF_IDL (Main Track)

Competition results for the QF_IDL division as of Thu Jul 7 07:24:34 GMT.

Benchmarks in this division: 2094

Winners

Sequential Performance: Yices2
Parallel Performance: Yices2

Result table [1]

Sequential Performance

Solver        Error Score   Correct Score   avg. CPU time (s)
CVC4          0.000         1877.827        341.976
SMTInterpol   0.000         1647.765        584.632
Yices2        0.000         1941.895        214.185
veriT-dev     0.000         1741.542        503.330
z3 [n]        0.000         1956.349        238.988

Parallel Performance

Solver        Error Score   Correct Score   avg. CPU time (s)   avg. WALL time (s)   Unsolved
CVC4          0.000         1877.827        342.109             341.785              256
SMTInterpol   0.000         1647.765        626.648             576.564              460
Yices2        0.000         1941.895        214.289             214.124              146
veriT-dev     0.000         1741.542        503.572             503.256              451
z3 [n]        0.000         1956.349        239.080             238.935              130

[n] Non-competitive.

[1] Scores are computed according to Section 7 of the rules.
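The exact scoring formula is defined in Section 7 of the competition rules. As a rough, hedged illustration only: within a division, solvers can be compared by error score (lower is better), then correct score (higher is better), with average CPU time breaking ties, and non-competitive entries excluded from winning. The sketch below applies that ordering to the sequential-performance table above; the `Entry` type and the `ranking`/`winner` helpers are illustrative names, not part of any SMT-COMP tooling.

```python
# Hedged sketch of division ranking. The authoritative aggregation is
# Section 7 of the SMT-COMP 2016 rules; this only illustrates the
# ordering suggested by the result table: error score ascending,
# correct score descending, average CPU time ascending.
from typing import NamedTuple

class Entry(NamedTuple):
    solver: str
    error_score: float
    correct_score: float
    avg_cpu_time: float
    competitive: bool = True

# Sequential-performance results for QF_IDL, copied from the table above.
results = [
    Entry("CVC4", 0.000, 1877.827, 341.976),
    Entry("SMTInterpol", 0.000, 1647.765, 584.632),
    Entry("Yices2", 0.000, 1941.895, 214.185),
    Entry("veriT-dev", 0.000, 1741.542, 503.330),
    Entry("z3", 0.000, 1956.349, 238.988, competitive=False),
]

def ranking(entries):
    """Order entries best-first under the illustrative criteria."""
    return sorted(entries,
                  key=lambda e: (e.error_score, -e.correct_score, e.avg_cpu_time))

def winner(entries):
    """Best competitive solver; non-competitive entries cannot win."""
    return ranking([e for e in entries if e.competitive])[0].solver

print(winner(results))  # Yices2
```

Note how z3 tops the raw ordering on correct score but, being non-competitive, does not take the division; Yices2 is then the best competitive entry, matching the Winners section above.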