SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2016

QF_ALIA (Main Track)

Competition results for the QF_ALIA division as of Thu Jul 7 07:24:34 GMT 2016.

Benchmarks in this division: 139
Time Limit: 1200s

Winners

Sequential Performance: Yices2
Parallel Performance:   Yices2

Result table (1)

Sequential Performance

Solver         Error Score   Correct Score   avg. CPU time (s)
CVC4                 0.000         137.590              34.567
MathSat5 (n)         0.000         139.000               2.621
SMTInterpol          0.000         139.000               6.508
Yices2               0.000         139.000               0.235
veriT-dev            0.000          20.566               0.047
z3 (n)               0.000         125.676             242.180

Parallel Performance

Solver         Error Score   Correct Score   avg. CPU time (s)   avg. wall time (s)   Unsolved
CVC4                 0.000         137.590              34.580               34.563          1
MathSat5 (n)         0.000         139.000               2.621                2.545          0
SMTInterpol          0.000         139.000               6.508                3.910          0
Yices2               0.000         139.000               0.235                0.239          0
veriT-dev            0.000          20.566               0.047                0.048        123
z3 (n)               0.000         125.676             242.302              242.175          6

(n) Non-competitive.

(1) Scores are computed according to Section 7 of the rules.
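For illustration only, here is a minimal sketch of how a per-solver result tuple (error score, correct score, average CPU time) could be derived from raw runs. The official Section 7 rules additionally weight benchmarks by family (which is why correct scores such as 137.590 are fractional); the unweighted counting below, along with the Result record and its field names, is an assumption made for this sketch.

```python
# Hypothetical scoring sketch. NOT the official SMT-COMP 2016
# Section 7 formula, which also weights benchmarks by family.
from dataclasses import dataclass

@dataclass
class Result:
    answer: str      # "sat", "unsat", or "unknown" (e.g. timeout)
    expected: str    # the benchmark's known status
    cpu_time: float  # CPU seconds used, capped at the 1200 s limit

def score(results: list[Result]) -> tuple[float, float, float]:
    """Return (error_score, correct_score, avg_cpu_time) for one solver."""
    # An error is a definitive answer that contradicts the known status.
    errors = sum(1 for r in results
                 if r.answer in ("sat", "unsat") and r.answer != r.expected)
    # A correct answer is a definitive answer matching the known status.
    correct = sum(1 for r in results
                  if r.answer in ("sat", "unsat") and r.answer == r.expected)
    avg_cpu = sum(r.cpu_time for r in results) / len(results)
    return float(errors), float(correct), avg_cpu

# Example: one correct answer plus one timeout on a 1200 s limit.
runs = [Result("unsat", "unsat", 0.4), Result("unknown", "sat", 1200.0)]
print(score(runs))  # (0.0, 1.0, 600.2)
```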