SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2016

QF_LIA (Main Track)

Competition results for the QF_LIA division as of Thu Jul 7 07:24:34 GMT 2016.

Benchmarks in this division: 5839
Time Limit: 1200s

Winners

Sequential Performance   Parallel Performance
CVC4                     CVC4

Result table [1]

Sequential Performance

Solver         Error Score   Correct Score   avg. CPU time (s)
CVC4                 0.000        5510.092             167.597
MathSat5 [n]         0.000        5561.147             138.080
ProB                 0.000        1178.476            1507.090
SMT-RAT              0.000        3223.364            1098.452
SMTInterpol          0.000        5456.162             198.425
Yices2               0.000        5465.835             180.926
veriT-dev            0.000        2854.377             709.000
z3 [n]               0.000        5444.101             187.484

Parallel Performance

Solver         Error Score   Correct Score   avg. CPU time (s)   avg. wall time (s)   Unsolved
CVC4                 0.000        5510.092             167.669              167.493        168
MathSat5 [n]         0.000        5561.147             138.144              138.040        120
ProB                 0.000        1178.476            1508.020             1507.096       5148
SMT-RAT              0.000        3223.814            1099.095             1098.449       3450
SMTInterpol          0.000        5456.666             211.890              190.450        251
Yices2               0.000        5465.835             181.015              180.885        206
veriT-dev            0.000        2854.377             709.385              708.966       3578
z3 [n]               0.000        5444.101             187.577              187.420        254

[n] Non-competitive entry; not eligible to win the division (which is why CVC4, not MathSat5, is the winner despite the lower correct score).

[1] Scores are computed according to Section 7 of the competition rules.
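
The official correct score is the weighted sum defined in Section 7 of the rules; the weights are not reproduced here. As a rough illustration only, the Python sketch below aggregates per-benchmark records into the summary columns of the tables above, using a plain count of correct answers as an unweighted stand-in for the official score. The input format and all names (`summarize`, `results`) are assumptions for this sketch, not part of the competition tooling.

```python
from collections import defaultdict

TIME_LIMIT = 1200.0  # per-benchmark limit in this division (seconds)

def summarize(results):
    """Aggregate per-benchmark records into per-solver summary rows.

    `results` is an iterable of (solver, solved, cpu_time, wall_time)
    tuples, where `solved` is True iff the solver returned a correct
    answer within the time limit. The correct score here is a plain
    count, i.e. a simplified stand-in for the official weighted score.
    """
    acc = defaultdict(lambda: {"correct": 0, "cpu": 0.0, "wall": 0.0, "n": 0})
    for solver, solved, cpu, wall in results:
        row = acc[solver]
        row["n"] += 1
        row["correct"] += int(solved)
        # Cap charged times at the limit, so a timed-out run
        # contributes the full 1200 s to the averages.
        row["cpu"] += min(cpu, TIME_LIMIT)
        row["wall"] += min(wall, TIME_LIMIT)
    return {
        solver: {
            "correct_score": row["correct"],
            "avg_cpu_s": row["cpu"] / row["n"],
            "avg_wall_s": row["wall"] / row["n"],
            "unsolved": row["n"] - row["correct"],
        }
        for solver, row in acc.items()
    }
```

Fed one record per (solver, benchmark) pair over all 5839 benchmarks, this reproduces the shape of the parallel table above, up to the weighting that the official correct score applies.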