SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2016

QF_ALIA (Application Track)

Competition results for the QF_ALIA division as of Thu Jul 7 07:24:34 GMT 2016.

Benchmarks in this division: 44
Time Limit: 2400s

Winner: SMTInterpol

Result table [1]

Parallel Performance

Solver          Error Score   Correct Score   avg. CPU time (s)   avg. WALL time (s)
CVC4 [n]                  0          352571            28444.33             28394.92
Yices2                    0          530341             7524.76              7461.20
MathSat5 [n]              0          490951            14058.03             13978.16
SMTInterpol               0          530398             2205.81              1343.37
z3 [n]                    0          530398              710.96               651.95

[n] Non-competitive.

[1] Scores are computed according to Section 7 of the rules.
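Note that z3 matches SMTInterpol's correct score with a lower wall time, but only competitive entries are eligible to win. As a minimal sketch (not the official scoring script), and assuming the Section 7 ranking orders competitive solvers by error score (ascending), then correct score (descending), with average wall time breaking remaining ties, the winner selection could be reproduced as follows; the figures are copied from the result table, while the comparison key itself is an assumption:

    # Hypothetical ranking sketch for the QF_ALIA (Application Track) results.
    # The ordering criteria are an assumption based on the reference to
    # Section 7 of the rules, not the competition's own scoring code.
    from dataclasses import dataclass

    @dataclass
    class Result:
        solver: str
        error_score: int
        correct_score: int
        avg_cpu_time: float
        avg_wall_time: float
        competitive: bool = True

    results = [
        Result("CVC4",        0, 352571, 28444.33, 28394.92, competitive=False),
        Result("Yices2",      0, 530341,  7524.76,  7461.20),
        Result("MathSat5",    0, 490951, 14058.03, 13978.16, competitive=False),
        Result("SMTInterpol", 0, 530398,  2205.81,  1343.37),
        Result("z3",          0, 530398,   710.96,   651.95, competitive=False),
    ]

    def ranking_key(r: Result):
        # Lower error score is better, higher correct score is better,
        # lower average wall time breaks any remaining ties.
        return (r.error_score, -r.correct_score, r.avg_wall_time)

    competitive_entries = [r for r in results if r.competitive]
    winner = min(competitive_entries, key=ranking_key)
    print(winner.solver)  # prints: SMTInterpol

Under these assumed criteria, z3 and MathSat5 are excluded as non-competitive entries before ranking, which is why SMTInterpol is listed as the division winner despite z3's lower run times.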