SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2016

QF_UFLIA (Main Track)

Competition results for the QF_UFLIA division as of Thu Jul 7 07:24:34 GMT 2016.

Benchmarks in this division: 598
Time Limit: 1200s

Winners

Sequential Performance: Yices2
Parallel Performance: Yices2

Result table [1]

Sequential Performance

Solver         Error Score   Correct Score   avg. CPU time (s)
CVC4               0.000         598.000             1.227
MathSat5 [n]       0.000         598.000             1.859
SMTInterpol        0.000         598.000             3.730
Yices2             0.000         598.000             0.235
veriT-dev          0.000         554.478           135.664
z3 [n]             0.000         598.000             0.168

Parallel Performance

Solver         Error Score   Correct Score   avg. CPU time (s)   avg. wall time (s)   Unsolved
CVC4               0.000         598.000             1.227               1.222            0
MathSat5 [n]       0.000         598.000             1.859               1.850            0
SMTInterpol        0.000         598.000             3.730               2.147            0
Yices2             0.000         598.000             0.235               0.172            0
veriT-dev          0.000         554.478           135.691             135.612           50
z3 [n]             0.000         598.000             0.168               0.166            0

[1] Scores are computed according to Section 7 of the rules.
[n] Non-competitive.
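
For intuition, here is a minimal sketch of the kind of per-solver aggregation behind the result tables above. It assumes the simplest unweighted reading: one point per correctly solved benchmark, wrong answers counted in the error score, and CPU time averaged over all benchmarks. The actual Section 7 rules weight benchmarks by family (which is why veriT-dev's correct score is non-integral), so the names and the exact scheme below are illustrative assumptions, not the competition's scoring code.

    from dataclasses import dataclass

    @dataclass
    class RunResult:
        solver: str         # e.g. "Yices2" (hypothetical input record)
        answer: str         # "sat", "unsat", or "unknown" (includes timeouts)
        expected: str       # known status of the benchmark
        cpu_seconds: float  # CPU time charged to this run

    def score_division(runs):
        """Aggregate error score, correct score, and avg. CPU time per solver."""
        totals = {}
        for r in runs:
            row = totals.setdefault(r.solver, {"error": 0.0, "correct": 0.0,
                                               "cpu": 0.0, "runs": 0})
            row["runs"] += 1
            row["cpu"] += r.cpu_seconds
            if r.answer in ("sat", "unsat"):
                if r.answer == r.expected:
                    row["correct"] += 1.0  # unweighted: one point per solved benchmark
                else:
                    row["error"] += 1.0    # wrong answer
        for row in totals.values():
            row["cpu"] /= row["runs"]      # average over all benchmarks in the division
        return totals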
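
The parallel table reports both CPU and wall time because the two diverge for multi-threaded solvers: CPU time sums the time spent on all cores, while wall time is elapsed real time, so a solver running on two cores can show wall time well below CPU time (as SMTInterpol does above). The snippet below only illustrates how the two clocks are read in Python; the sleep makes wall time exceed CPU time, i.e. the same effect in the opposite direction.

    import time

    cpu0  = time.process_time()   # CPU time of this process, summed over threads
    wall0 = time.perf_counter()   # elapsed wall-clock time

    total = sum(i * i for i in range(1_000_000))  # CPU-bound work: both clocks advance
    time.sleep(0.5)                               # waiting: only wall time advances

    print(f"CPU  time: {time.process_time() - cpu0:.3f} s")
    print(f"wall time: {time.perf_counter() - wall0:.3f} s")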