SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2017

QF_LRA (Application Track)

Competition results for the QF_LRA division as of Tue Jul 18 22:06:21 GMT 2017.

Benchmarks in this division: 10
Time Limit: 2400s

Winner: SMTInterpol

Result table [1]

Parallel Performance

Solver              Error score   Correct score   Avg. CPU time (s)   Avg. wall time (s)
CVC4                          0             693            17965.80             17966.71
SMTInterpol                   0             769             9921.74              9046.35
Yices2                        0             749             9625.35              9625.58
mathsat-5.4.1 [n]             0             795             2948.68              2947.94
opensmt2                      0               0                0.03                 0.31
z3-4.5.0 [n]                  0             742            17462.12             17463.78

[n] Non-competing.

[1] Scores are computed according to Section 7 of the rules.
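As a worked example of how the table above is read, the sketch below derives the division winner from the result table. It is a minimal illustration, assuming the application-track ordering described in Section 7 of the rules (fewest wrong answers first, then most correctly solved queries, with average CPU time as a tie-breaker) and that non-competing entries are excluded from the ranking; the results data layout and the pick_winner helper are hypothetical, not the official scoring scripts.

```python
# Minimal sketch of the ranking applied to the result table above, assuming
# the Section 7 ordering: fewest errors, then most correct answers, then
# lowest average CPU time. Non-competing entries [n] are excluded.
# The data layout and pick_winner helper are illustrative only.

# (solver, error score, correct score, avg. CPU time in s, competing?)
results = [
    ("CVC4",          0, 693, 17965.80, True),
    ("SMTInterpol",   0, 769,  9921.74, True),
    ("Yices2",        0, 749,  9625.35, True),
    ("mathsat-5.4.1", 0, 795,  2948.68, False),  # non-competing
    ("opensmt2",      0,   0,     0.03, True),
    ("z3-4.5.0",      0, 742, 17462.12, False),  # non-competing
]

def pick_winner(rows):
    competing = [r for r in rows if r[4]]
    # Sort ascending by errors, descending by correct score,
    # ascending by average CPU time.
    competing.sort(key=lambda r: (r[1], -r[2], r[3]))
    return competing[0][0]

print(pick_winner(results))  # SMTInterpol
```

Note that mathsat-5.4.1 has the highest correct score (795) but is non-competing, so SMTInterpol wins among the competing solvers with 769.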