SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2016

QF_LRA (Unknown Benchmarks Track)

Competition results for the QF_LRA division as of Thu Jul 7 07:28:02 GMT 2016.

Benchmarks in this division: 56
Time limit: 1200 s

Non-competitive division
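
Benchmarks in this division are SMT-LIB 2 scripts in the QF_LRA logic (quantifier-free linear real arithmetic). For illustration, a minimal benchmark of this form looks as follows; this is a hypothetical example, not one of the 56 competition benchmarks:

    (set-logic QF_LRA)
    (declare-fun x () Real)
    (declare-fun y () Real)
    ; two linear constraints over the reals
    (assert (<= (+ x (* 2 y)) 3))
    (assert (>= (- x y) 1))
    (check-sat)   ; a solver answers: sat (e.g. x = 2, y = 0)
    (exit)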

Result table

Sequential Performance

Solver         Solved   avg. CPU time (s)   avg. wall time (s)
CVC4               11            31263.22             31512.16
OpenSMT2            0            33617.17             33602.51
SMT-RAT             0            33619.79             33602.55
Yices2             25            23536.01             23524.05
MathSat5 [n]        1            33176.27             33161.99
SMTInterpol        24            29755.37             24961.66
toysmt              0            33336.43             33323.90
veriT-dev           8            31119.57             31103.73
z3 [n]              0            33617.30             33602.40

[n] Non-competitive solver.
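
To make the aggregation in the table concrete, here is a minimal Python sketch that recomputes the "Solved" and average-time columns from per-benchmark results. The CSV file name and its columns (solver, benchmark, result, cpu_time, wall_time) are assumptions for illustration; this is not the official SMT-COMP scoring tool, whose exact rules are defined in the competition rules document.

    #!/usr/bin/env python3
    """Recompute the division summary from per-benchmark results.

    A minimal sketch. It assumes a hypothetical CSV file with one row
    per (solver, benchmark) run; the real SMT-COMP results data may be
    organized differently.
    """
    import csv
    from collections import defaultdict

    RESULTS_CSV = "qf_lra_unknown.csv"  # hypothetical file name

    def summarize(path):
        stats = defaultdict(lambda: {"solved": 0, "cpu": 0.0, "wall": 0.0, "n": 0})
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                s = stats[row["solver"]]
                s["n"] += 1
                s["cpu"] += float(row["cpu_time"])
                s["wall"] += float(row["wall_time"])
                # a run counts as solved only if the recorded result is
                # sat or unsat; timeouts and "unknown" answers do not count
                if row["result"] in ("sat", "unsat"):
                    s["solved"] += 1
        return stats

    if __name__ == "__main__":
        stats = summarize(RESULTS_CSV)
        print(f"{'Solver':<14}{'Solved':>8}{'avg. CPU (s)':>16}{'avg. wall (s)':>16}")
        for solver, s in sorted(stats.items()):
            print(f"{solver:<14}{s['solved']:>8}"
                  f"{s['cpu'] / s['n']:>16.2f}{s['wall'] / s['n']:>16.2f}")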