SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2016

QF_UFLRA (Main Track)

Competition results for the QF_UFLRA division as of Thu Jul 7 07:24:34 GMT 2016.

Benchmarks in this division: 1627
Time Limit: 1200s

Winners

Sequential Performance: Yices2
Parallel Performance: Yices2

Result table (1)

Sequential Performance

Solver         Error Score   Correct Score   avg. CPU time (s)
CVC4           0.000         1546.178        119.866
MathSat5 (n)   0.000         1546.178        119.401
SMTInterpol    0.000         1546.178        121.513
Yices2         0.000         1610.836        72.875
toysmt         0.000         1063.832        840.585
veriT-dev      0.000         1562.342        116.852
z3 (n)         0.000         1562.342        96.081

Parallel Performance

Solver         Error Score   Correct Score   avg. CPU time (s)   avg. wall time (s)   Unsolved
CVC4           0.000         1546.178        119.917             119.867              5
MathSat5 (n)   0.000         1546.178        119.461             119.394              5
SMTInterpol    0.000         1546.178        123.042             120.215              5
Yices2         0.000         1610.836        72.886              72.853               1
toysmt         0.000         1063.832        840.852             840.530              594
veriT-dev      0.000         1562.342        116.908             116.843              4
z3 (n)         0.000         1562.342        96.134              96.080               4

(n) Non-competitive entry.

(1) Scores are computed according to Section 7 of the rules.
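
The scores above are aggregates over per-benchmark runs. Below is a minimal Python sketch of such an aggregation, using plain unweighted counting; the official scores apply the benchmark-family weighting defined in Section 7 of the rules, which is why the Correct Scores in the tables are fractional. All names in the sketch (RunResult, score_division) are illustrative and not part of any SMT-COMP tooling.

```python
from dataclasses import dataclass

@dataclass
class RunResult:
    solver: str
    answer: str      # "sat", "unsat", or anything else (unknown/timeout)
    expected: str    # known benchmark status, or "unknown"
    cpu_time: float  # seconds

def score_division(results, time_limit=1200.0):
    """Aggregate per-benchmark runs into per-solver scores.

    Illustrative only: simple unweighted counting, not the weighted
    scoring of Section 7 of the SMT-COMP 2016 rules.
    """
    table = {}
    for r in results:
        row = table.setdefault(
            r.solver, {"error": 0, "correct": 0, "unsolved": 0, "cpu": 0.0, "n": 0}
        )
        row["n"] += 1
        row["cpu"] += min(r.cpu_time, time_limit)
        if r.answer not in ("sat", "unsat"):
            row["unsolved"] += 1   # no answer within the time limit
        elif r.expected not in ("unknown", r.answer):
            row["error"] += 1      # answer contradicts the known status
        else:
            row["correct"] += 1    # solved correctly
    return {
        solver: {
            "error_score": row["error"],
            "correct_score": row["correct"],
            "avg_cpu_time": row["cpu"] / row["n"],
            "unsolved": row["unsolved"],
        }
        for solver, row in table.items()
    }
```

For parallel-performance scoring, the same aggregation would be run over wall-clock times instead of CPU times.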