SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2016

LIA (Main Track)

Competition results for the LIA division as of Thu Jul 7 07:24:34 GMT

Benchmarks in this division: 201

Winners

Sequential Performance: CVC4
Parallel Performance:   CVC4

Result table [1]

Sequential Performance

Solver                      Error Score   Correct Score   avg. CPU time
CVC4                              0.000         201.000           0.012
ProB                              0.000          67.923         259.098
vampire_smt_4.1                   0.000         196.398          26.894
vampire_smt_4.1_parallel          0.000         197.119          46.504
veriT-dev                         0.000         140.839          69.528
z3 (n)                            0.000         201.000           0.039

Parallel Performance

Solver                      Error Score   Correct Score   avg. CPU time   avg. WALL time   Unsolved
CVC4                              0.000         201.000           0.012            0.013          0
ProB                              0.000          67.923         259.252          259.100        166
vampire_smt_4.1                   0.000         196.398          26.894           26.764          3
vampire_smt_4.1_parallel          0.000         197.119          77.406           19.800          2
veriT-dev                         0.000         140.839          69.569           69.531         31
z3 (n)                            0.000         201.000           0.039            0.037          0

n. Non-competitive.

1. Scores are computed according to Section 7 of the rules.
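The exact scoring formula lives in Section 7 of the rules, but the ranking suggested by the tables above can be sketched as follows: non-competitive entries are excluded, and competitive solvers are ordered by error score (ascending), then correct score (descending), with average CPU time as a tie-breaker. This is a minimal illustrative sketch, not the official scoring implementation; the tuple layout and the `winner` helper are assumptions for illustration.

```python
# Hypothetical sketch of ranking the sequential-performance table.
# The authoritative definition is Section 7 of the competition rules;
# here we only encode the ordering implied by the published columns.

sequential = [
    # (solver, error score, correct score, avg. CPU time, competitive?)
    ("CVC4",                     0.000, 201.000,   0.012, True),
    ("ProB",                     0.000,  67.923, 259.098, True),
    ("vampire_smt_4.1",          0.000, 196.398,  26.894, True),
    ("vampire_smt_4.1_parallel", 0.000, 197.119,  46.504, True),
    ("veriT-dev",                0.000, 140.839,  69.528, True),
    ("z3",                       0.000, 201.000,   0.039, False),  # non-competitive
]

def winner(rows):
    # Drop non-competitive solvers, then sort by: fewer errors,
    # more correct answers, lower average CPU time.
    competitive = [r for r in rows if r[4]]
    competitive.sort(key=lambda r: (r[1], -r[2], r[3]))
    return competitive[0][0]

print(winner(sequential))  # -> CVC4
```

Note that z3 actually matches CVC4 on correct score and beats every other entry, but as a non-competitive entry it is excluded from the ranking.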