SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2014

Application Track (Summary)

Competition results as of 17 July 2014 (100.0% of jobs completed).

| Logic     | Solvers | Benchmarks | # check-sat commands | % complete | Order (solver ranking)                         |
|-----------|---------|------------|----------------------|------------|------------------------------------------------|
| AUFNIRA   | 3       | 165        | 0                    | 100%       | CVC4; Z3 (n); CVC3                             |
| QF_AUFLIA | 5       | 72         | 4,699,864            | 100%       | Yices2; Z3 (n); SMTInterpol; CVC4; MathSAT (n) |
| QF_BV     | 4       | 18         | 2,141                | 100%       | MathSAT (n); Yices2; CVC4; Z3 (n)              |
| QF_LIA    | 5       | 65         | 19,689,957           | 100%       | Yices2; Z3 (n); SMTInterpol; MathSAT (n); CVC4 |
| QF_LRA    | 5       | 10         | 795                  | 100%       | MathSAT (n); SMTInterpol; Yices2; Z3 (n); CVC4 |
| QF_UFLIA  | 5       | 905        | 766,079              | 100%       | Z3 (n); CVC4; SMTInterpol; Yices2; MathSAT (n) |
| QF_UFLRA  | 5       | 3,333      | 22,066               | 100%       | Yices2; Z3 (n); SMTInterpol; CVC4; MathSAT (n) |
| UFLRA     | 3       | 5,358      | 223,820              | 100%       | Z3 (n); CVC3; CVC4                             |

(n) Non-competitive entry.
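
In the application track, each benchmark is an incremental SMT-LIB script that may issue many check-sat commands in a single run, which is why the "# check-sat commands" column is much larger than the number of benchmarks for most logics. The script below is a minimal illustrative sketch, not an actual competition benchmark: an incremental QF_LIA input with two check-sat queries over a shared set of declarations.

; Hypothetical incremental QF_LIA script (illustration only, not a competition benchmark).
(set-logic QF_LIA)
(declare-fun x () Int)
(declare-fun y () Int)
(assert (>= x 0))
(push 1)
(assert (< (+ x y) 5))
(check-sat)                ; first query: sat (e.g. x = 0, y = 0)
(pop 1)
(assert (= y (+ x 10)))
(assert (< y 3))
(check-sat)                ; second query: unsat, since y = x + 10 >= 10 contradicts y < 3
(exit)

Each check-sat call produces a separate sat/unsat answer, and every such call counts toward the per-logic totals in the table above.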