SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.


SMT-COMP 2016


QF_LIA (Unknown Benchmarks Track)

Competition results for the QF_LIA division as of Thu Jul 7 07:28:02 GMT 2016. In this track, solvers are evaluated on benchmarks whose expected status (sat or unsat) is unknown.

Benchmarks in this division: 302

Non-competitive division
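
For context, QF_LIA is the SMT-LIB logic of quantifier-free linear integer arithmetic. A minimal benchmark in this logic might look like the following sketch (an illustrative toy instance, not one of the 302 competition benchmarks):

    (set-logic QF_LIA)
    (declare-const x Int)
    (declare-const y Int)
    ; x + 2y > 5 and x < 3: satisfiable, e.g. with x = 2, y = 2
    (assert (> (+ x (* 2 y)) 5))
    (assert (< x 3))
    (check-sat)   ; expected answer: sat
    (get-model)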

Result table

Sequential Performance

Solver          Solved   avg. CPU time (s)   avg. wall time (s)
CVC4               138           102195.63            102137.51
ProB                 4           166345.71            166243.99
SMT-RAT            146            93776.27             93725.89
Yices2             111           126312.94            126239.19
MathSat5 (n)       214            60925.09             60894.59
SMTInterpol        109           140480.73            124971.12
veriT-dev          108            64405.32             64366.70
z3 (n)             110           116250.41            116191.09

(n) Non-competitive entry.