SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2015

QF_ALIA (Main Track)

Competition results for the QF_ALIA division as of Fri Oct 30 12:49:29 GMT.

Competition benchmarks = 134
Competition industrial benchmarks = 103

The winners for this division are:

Category                              Winner
Sequential Performance                Yices
Sequential Performance (industrial)   Yices
Parallel Performance                  Yices
Parallel Performance (industrial)     Yices

Division COMPLETE

Sequential Performance

Solver        Errors  Corrects  CPU time (s)
CVC4          0       119       47922.86
CVC4 (exp)    0       133       4471.07
SMTInterpol   0       134       886.11
Yices         0       134       73.32
MathSat (n)   0       134       492.54
z3 (n)        0       133       6505.91
veriT         0       16        6.19

Sequential Performance (industrial)

Solver        Errors  Corrects  CPU time (s)
CVC4          0       88        47922.44
CVC4 (exp)    0       102       4470.66
SMTInterpol   0       103       862.59
Yices         0       103       73.26
MathSat (n)   0       103       491.55
z3 (n)        0       102       6504.74
veriT         0       16        6.10

Parallel Performance

Solver        Errors  Corrects  CPU time (s)  Wall time (s)
CVC4          0       119       47936.24      47919.51
CVC4 (exp)    0       133       4471.99       4470.88
SMTInterpol   0       134       886.11        583.65
Yices         0       134       73.32         73.70
MathSat (n)   0       134       492.54        492.53
z3 (n)        0       133       6506.90       6504.49
veriT         0       16        6.19          6.50

Parallel Performance (industrial)

Solver        Errors  Corrects  CPU time (s)  Wall time (s)
CVC4          0       88        47935.82      47919.03
CVC4 (exp)    0       102       4471.58       4470.39
SMTInterpol   0       103       862.59        567.96
Yices         0       103       73.26         73.44
MathSat (n)   0       103       491.55        491.51
z3 (n)        0       102       6505.73       6503.31
veriT         0       16        6.10          6.25

Other Information

Solver        Not Solved  Remaining
CVC4          15          0
CVC4 (exp)    1           0
SMTInterpol   0           0
Yices         0           0
MathSat (n)   0           0
z3 (n)        1           0
veriT         118         0

(n) Non-competitive.