SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2014

QF_LIA (Application Track)

Competition results for the QF_LIA division as of Jul 17, 2014.
Competition benchmarks: 65, containing 19689957 check-sat commands in total.
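
In the Application Track, each benchmark is an incremental SMT-LIB script that issues many check-sat commands against an evolving assertion stack, which is why the number of check-sat commands far exceeds the number of benchmarks. The sketch below is an illustrative QF_LIA script of this kind, not an actual competition benchmark:

    ; illustrative incremental QF_LIA script (hypothetical example)
    (set-option :print-success false)
    (set-logic QF_LIA)
    (declare-fun x () Int)
    (declare-fun y () Int)
    (assert (<= 0 x))
    (push 1)
    (assert (and (<= x y) (< y 10)))
    (check-sat)        ; first query
    (pop 1)
    (push 1)
    (assert (> (+ x y) 100))
    (check-sat)        ; second query on a modified assertion stack
    (pop 1)
    (exit)

Each check-sat answered correctly counts toward the Solved column in the table below.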

Division COMPLETE: The winner is Yices2

Parallel Performance

Solver         Errors   Solved     Not Solved   CPU Time (s)
Z3 (n)         0        19689683   274          58228.50
MathSAT (n)    0        17619155   2070802      57933.90
SMTInterpol    0        19689059   898          69734.60
CVC4           0        12648373   7041584      96877.20
Yices2         0        19689907   50           36659.10

(n) Non-competitive.