SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.


SMT-COMP 2016


QF_UFIDL (Main Track)

Competition results for the QF_UFIDL division as of Thu 7 Jul 2016, 07:24:34 GMT.

Benchmarks in this division: 441
Time Limit: 1200s

Winners

Sequential Performance: Yices2
Parallel Performance:   Yices2

Result table¹

Sequential Performance

Solver        Error Score   Correct Score   avg. CPU time (s)
CVC4                0.000         415.071             185.215
SMTInterpol         0.000         432.991              64.356
Yices2              0.000         439.018              13.053
veriT-dev           0.000         416.544             108.535
z3ⁿ                 0.000         439.018              14.374

Parallel Performance

Solver        Error Score   Correct Score   avg. CPU time (s)   avg. wall time (s)   Unsolved
CVC4                0.000         415.071             185.297              185.189         29
SMTInterpol         0.000         432.991              65.093               60.027         19
Yices2              0.000         439.018              13.060               13.050          2
veriT-dev           0.000         416.544             108.581              108.456         30
z3ⁿ                 0.000         439.018              14.381               14.365          2

n. Non-competitive.

1. Scores are computed according to Section 7 of the rules.
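As a rough illustration of how a winner can be read off from the sequential table above, the sketch below ranks the entries lexicographically: lower error score first, then higher correct score, then lower average CPU time as a tiebreaker. This ordering is an assumption made for illustration; the authoritative definition is Section 7 of the rules, and non-competitive entries (marked ⁿ) are excluded from winning regardless of score.

```python
# Hedged sketch: ranking the sequential results, ASSUMING a lexicographic
# comparison (lower error score, then higher correct score, then lower
# avg. CPU time). See Section 7 of the rules for the actual definition.
results = [
    # (solver, error score, correct score, avg. CPU time in s, competitive?)
    ("CVC4",        0.000, 415.071, 185.215, True),
    ("SMTInterpol", 0.000, 432.991,  64.356, True),
    ("Yices2",      0.000, 439.018,  13.053, True),
    ("veriT-dev",   0.000, 416.544, 108.535, True),
    ("z3",          0.000, 439.018,  14.374, False),  # non-competitive entry
]

def sort_key(entry):
    _, err, correct, cpu, _ = entry
    # Negate the correct score so that sorting ascending prefers more
    # correct answers; errors and CPU time are already "lower is better".
    return (err, -correct, cpu)

# Only competitive entries are eligible to win the division.
ranking = [name for name, *_, comp in sorted(results, key=sort_key) if comp]
print(ranking[0])  # Yices2
```

Note that z3 ties Yices2 on error and correct score here, so even if it were competitive, the CPU-time tiebreaker (13.053 s vs. 14.374 s) would still place Yices2 first under this ordering.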