SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2016

QF_LRA (Application Track)

Competition results for the QF_LRA division as of Thu Jul 7 07:24:34 GMT 2016.

Benchmarks in this division: 10

Winner: SMTInterpol

Result table¹

Parallel Performance

Solver      Error Score  Correct Score  avg. CPU time (s)  avg. WALL time (s)
CVC4ⁿ                 0            666           19103.84            19092.12
Yices2                0            741            9649.02             9643.03
MathSat5ⁿ             0            795            1733.19             1731.39
SMTInterpol           0            755           13463.70            12147.38
z3ⁿ                   0            744           16908.14            16898.90

ⁿ. Non-competitive: run for comparison only and not eligible to win the division.

¹. Scores are computed according to Section 7 of the rules.
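
A note on reading the table: in the Application Track each benchmark is an incremental trace containing many individual queries, which is why the correct scores run into the hundreds even though the division holds only 10 benchmarks. As a rough illustration only, the Python sketch below shows one plausible way a row of the table could be aggregated from per-benchmark runs. The Run fields, the query-level counting, and the function names are assumptions made for illustration, not the official Section 7 scoring implementation.

    # A minimal sketch, assuming query-level scoring over incremental traces;
    # NOT the official SMT-COMP implementation. All names here are hypothetical.
    from dataclasses import dataclass
    from statistics import mean

    @dataclass
    class Run:
        """One solver's execution on one incremental benchmark trace."""
        correct_queries: int  # queries answered consistently with the reference
        wrong_queries: int    # queries answered inconsistently with the reference
        cpu_time: float       # CPU seconds spent on the whole trace
        wall_time: float      # wall-clock seconds spent on the whole trace

    def division_row(runs: list[Run]) -> dict:
        """Aggregate one solver's runs into the table's four columns."""
        return {
            "error_score": sum(r.wrong_queries for r in runs),
            "correct_score": sum(r.correct_queries for r in runs),
            "avg_cpu_time": mean(r.cpu_time for r in runs),
            "avg_wall_time": mean(r.wall_time for r in runs),
        }

Under these assumptions, summing correct query answers across the division's 10 traces would yield a solver's correct score (e.g. 755 for SMTInterpol above), while any answer contradicting the reference would raise the error score above 0.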