SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.


SMT-COMP 2014


QF_LRA (Application Track)

Competition results for the QF_LRA division as of 17 July 2014.
Competition benchmarks: 10 (containing 795 check-sat commands in total).

Division COMPLETE: The winner is SMTInterpol

Parallel Performance

Solver        Errors  Solved  Not Solved  CPU Time (s)
Z3 (n)             0     728          67      17396.80
MathSAT (n)        0     793           2       7568.40
SMTInterpol        0     746          49      11414.90
CVC4               0     651         144      19516.40
Yices2             0     742          53       9589.42

(n) Non-competitive entry: included for comparison but not eligible to win the division.
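The winner can be reproduced from the table above. The following is a minimal sketch, assuming the ranking excludes non-competitive entries and orders solvers by errors (ascending), then solved check-sat commands (descending), then CPU time (ascending); the authoritative procedure is in the official 2014 competition rules.

```python
# Division results from the table above:
# (solver, competitive, errors, solved, not_solved, cpu_time_s)
results = [
    ("Z3",          False, 0, 728,  67, 17396.80),
    ("MathSAT",     False, 0, 793,   2,  7568.40),
    ("SMTInterpol", True,  0, 746,  49, 11414.90),
    ("CVC4",        True,  0, 651, 144, 19516.40),
    ("Yices2",      True,  0, 742,  53,  9589.42),
]

# Only competitive entries are eligible to win (assumption based on
# the "(n) Non-competitive" footnote).
competitive = [r for r in results if r[1]]

# Rank: fewest errors, then most solved, then least CPU time.
winner = min(competitive, key=lambda r: (r[2], -r[3], r[5]))
print(winner[0])  # SMTInterpol
```

Note that MathSAT solved the most check-sat commands (793) but, as a non-competitive entry, is not ranked; among the competitive solvers, SMTInterpol's 746 solved commands lead.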