SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2015

QF_LRA (Main Track)

Competition results for the QF_LRA division as of Fri Oct 30 12:49:29 GMT

Competition benchmarks = 1626
Competition industrial benchmarks = 1624

The winners for this division are:

Sequential Performance   Sequential Performance (industrial)   Parallel Performance   Parallel Performance (industrial)
CVC4                     CVC4                                  CVC4                   CVC4

Division COMPLETE
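The tables below list, for each solver, the number of erroneous answers, the number of correctly solved benchmarks, and the accumulated CPU time (plus wall-clock time for the parallel rankings). As a rough illustration only, and not the official SMT-COMP scoring tool, the Python sketch below shows how a winner could be read off such rows, assuming solvers are ranked by fewest errors, then most correct answers, then lowest accumulated time, with non-competitive entries excluded; the sample rows are taken from the sequential table that follows.

    # A minimal sketch, not the official SMT-COMP scoring tool, assuming the
    # ranking orders solvers by fewest errors, then most correct answers,
    # then lowest accumulated time, and skips non-competitive entries.
    from typing import List, NamedTuple

    class Row(NamedTuple):
        solver: str
        errors: int
        corrects: int
        cpu_s: float       # accumulated CPU time in seconds
        competitive: bool  # False for entries marked (n) below

    def winner(rows: List[Row]) -> str:
        ranked = sorted(
            (r for r in rows if r.competitive),
            key=lambda r: (r.errors, -r.corrects, r.cpu_s),
        )
        return ranked[0].solver

    # A few rows from the sequential-performance table below:
    print(winner([
        Row("CVC4",    0, 1608, 144170.48, True),
        Row("Yices",   0, 1592, 128759.73, True),
        Row("veriT",   0, 1559, 245574.68, True),
        Row("MathSat", 0, 1565, 263060.85, False),  # non-competitive
    ]))  # -> CVC4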

Sequential Performance

Solver         Errors   Corrects   CPU time (s)
CVC4                0       1608      144170.48
CVC4 (exp)          0       1608      144212.59
SMT-RAT             0       1283      920736.25
SMTInterpol         0       1303      804243.06
Yices               0       1592      128759.73
MathSat (n)         0       1565      263060.85
z3 (n)              0       1472      504249.04
veriT               0       1559      245574.68

Sequential Performance (industrial)

Solver         Errors   Corrects   CPU time (s)
CVC4                0       1606      144170.46
CVC4 (exp)          0       1606      144212.57
SMT-RAT             0       1281      920736.23
SMTInterpol         0       1301      804242.23
Yices               0       1590      128759.73
MathSat (n)         0       1563      263060.80
z3 (n)              0       1470      504248.98
veriT               0       1557      245574.68

Parallel Performance

Solver         Errors   Corrects   CPU time (s)   Wall time (s)
CVC4                0       1608      144185.29       144120.79
CVC4 (exp)          0       1608      144228.04       144186.46
SMT-RAT             0       1283      921050.86       920669.97
SMTInterpol         0       1304      853583.69       797076.90
Yices               0       1592      128791.91       128747.91
MathSat (n)         0       1565      263114.73       263026.91
z3 (n)              0       1472      504390.71       504204.66
veriT               0       1559      245632.88       245557.64

Parallel Performance (industrial)

Solver         Errors   Corrects   CPU time (s)   Wall time (s)
CVC4                0       1606      144185.27       144120.76
CVC4 (exp)          0       1606      144228.02       144186.44
SMT-RAT             0       1281      921050.84       920669.95
SMTInterpol         0       1302      853582.86       797076.25
Yices               0       1590      128791.91       128747.89
MathSat (n)         0       1563      263114.68       263026.86
z3 (n)              0       1470      504390.65       504204.60
veriT               0       1557      245632.88       245557.62

Other Information

Solver         Not Solved   Remaining
CVC4                   18           0
CVC4 (exp)             18           0
SMT-RAT               343           0
SMTInterpol           322           0
Yices                  34           0
MathSat (n)            61           0
z3 (n)                154           0
veriT                  67           0

(n) Non-competitive.
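As a rough cross-check (my reading of the page, not something stated on it), the "Not Solved" figures above match the 1626 competition benchmarks minus each solver's "Corrects" count from the parallel-performance table:

    # Hypothetical cross-check: "Not Solved" appears to equal the 1626
    # competition benchmarks minus the correctly solved count from the
    # parallel-performance table.
    total = 1626
    parallel_corrects = {"CVC4": 1608, "SMTInterpol": 1304, "Yices": 1592, "z3": 1472}
    print({s: total - c for s, c in parallel_corrects.items()})
    # -> {'CVC4': 18, 'SMTInterpol': 322, 'Yices': 34, 'z3': 154}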