SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.


SMT-COMP 2016


QF_LRA (Unsat Core Track)

Competition results for the QF_LRA division as of Wed Jun 29 20:25:52 GMT.

Benchmarks in this division: 633
Time Limit: 2400s

Non-Competitive division

Result table (1)

Sequential Performance

Solver            Error Score   Reduction Score   avg. CPU time   avg. WALL time
MathSat5 (n)            0.000        126134.710          80.708           80.639
SMTInterpol             0.000        126143.621         145.540          135.430
toysmt (n)              0.000             0.000         528.203          528.191
veriT (n)               5.022             0.000         114.047          114.033
z3 (n)                  0.000        117893.068         103.679          103.662

(n) Non-competitive.

(1) Scores are computed according to Section 7 of the rules.
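The exact scoring and tie-breaking rules are defined in Section 7 of the competition rules. As a rough illustration only, the sketch below ranks the solvers in this table under an assumed ordering: lower error score first, then higher reduction score, then lower average CPU time. This ordering is a plausible reading of the results, not the official definition.

```python
# Hedged sketch: rank solvers from the QF_LRA unsat-core table above.
# The sort key (error score ascending, reduction score descending,
# avg. CPU time ascending) is an ASSUMPTION for illustration; the
# authoritative definition is Section 7 of the SMT-COMP 2016 rules.

results = [
    # (solver, error score, reduction score, avg CPU time, avg wall time)
    ("MathSat5",    0.000, 126134.710,  80.708,  80.639),
    ("SMTInterpol", 0.000, 126143.621, 145.540, 135.430),
    ("toysmt",      0.000,      0.000, 528.203, 528.191),
    ("veriT",       5.022,      0.000, 114.047, 114.033),
    ("z3",          0.000, 117893.068, 103.679, 103.662),
]

ranked = sorted(results, key=lambda r: (r[1], -r[2], r[3]))

for solver, err, red, cpu, wall in ranked:
    print(f"{solver:12s} error={err:7.3f} reduction={red:11.3f} cpu={cpu:8.3f}s")
```

Under this assumed ordering, SMTInterpol edges out MathSat5 on reduction score despite its higher average CPU time, and veriT ranks last because of its nonzero error score.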