SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2016

QF_AUFLIA (Main Track)

Competition results for the QF_AUFLIA division as of Thu Jul 7 07:24:34 GMT 2016.

Benchmarks in this division: 1009
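
QF_AUFLIA is the SMT-LIB logic of quantifier-free formulas over arrays, uninterpreted functions, and linear integer arithmetic. For illustration only (this is a minimal hand-written sketch, not one of the 1009 competition benchmarks), an instance in this logic in SMT-LIB 2 syntax:

    (set-logic QF_AUFLIA)
    (declare-const a (Array Int Int))
    (declare-fun f (Int) Int)
    (declare-const i Int)
    (declare-const j Int)
    ; Writing 0 at index i and reading at a distinct index j
    ; leaves the original contents of a at j untouched.
    (assert (distinct i j))
    (assert (= (select (store a i 0) j) (+ (f j) 1)))
    ; Linear integer arithmetic constraint on the uninterpreted function f.
    (assert (<= 0 (f j)))
    (check-sat)

A conforming solver answers sat; one model takes f to be constantly 0, i = 0, j = 1, and a mapping index 1 to 1.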

Winners

Sequential Performance: Yices2
Parallel Performance:   Yices2

Result table [1]

Sequential Performance

Solver         Error Score   Correct Score   avg. CPU time (s)
CVC4                 0.000        1009.000               4.101
MathSat5 [n]         0.000        1009.000               2.522
SMTInterpol          0.000        1009.000               2.333
Yices2               0.000        1009.000               0.039
veriT-dev            0.000          66.447               0.012
z3 [n]               0.000        1009.000               0.126

Parallel Performance

Solver         Error Score   Correct Score   avg. CPU time (s)   avg. wall time (s)   Unsolved
CVC4                 0.000        1009.000               4.101                4.098          0
MathSat5 [n]         0.000        1009.000               2.522                2.520          0
SMTInterpol          0.000        1009.000               2.333                1.390          0
Yices2               0.000        1009.000               0.039                0.043          0
veriT-dev            0.000          66.447               0.012                0.016        994
z3 [n]               0.000        1009.000               0.126                0.124          0

[n] Non-competitive.

[1] Scores are computed according to Section 7 of the competition rules.