SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.


SMT-COMP 2015

QF_AUFLIA (Main Track)

Competition results for the QF_AUFLIA division as of Fri Oct 30 12:49:29 GMT

Competition benchmarks = 1009
Competition industrial benchmarks = 19
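
For context, QF_AUFLIA is the SMT-LIB logic of quantifier-free formulas over arrays, uninterpreted functions, and linear integer arithmetic. The sketch below only illustrates what a benchmark in this logic looks like and how a solver is run on one; the tiny formula, the temporary-file handling, and the choice of z3 as the solver are assumptions for illustration (it needs a z3 binary on the PATH) and are not part of the competition setup.

    import subprocess
    import tempfile

    # A tiny QF_AUFLIA-style problem: an array, an uninterpreted
    # function, and a linear integer constraint. Competition benchmarks
    # are far larger than this.
    benchmark = """(set-logic QF_AUFLIA)
    (declare-fun a () (Array Int Int))
    (declare-fun f (Int) Int)
    (declare-fun x () Int)
    (declare-fun y () Int)
    (assert (= (select (store a x 3) x) (f y)))
    (assert (> (+ x (* 2 y)) 5))
    (check-sat)
    (exit)
    """

    with tempfile.NamedTemporaryFile("w", suffix=".smt2", delete=False) as fh:
        fh.write(benchmark)
        path = fh.name

    # Assumes a z3 binary is on the PATH; any SMT-LIB v2 solver from the
    # tables below could be substituted.
    print(subprocess.run(["z3", path], capture_output=True, text=True).stdout)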

The winners for this division are:

Category                               Winner
Sequential Performance                 Yices
Sequential Performance (industrial)    Yices
Parallel Performance                   Yices
Parallel Performance (industrial)      Yices

Division COMPLETE

Sequential Performance

Solver         Errors   Corrects   CPU time (s)
CVC4                0       1009        1381.95
CVC4 (exp)          0       1009         658.94
SMTInterpol         0       1009        1629.07
Yices               0       1009          38.54
MathSat (n)         0       1009         546.72
z3 (n)              0       1009         121.79
veriT               0         15           6.25
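
How the sequential winner above follows from this table can be sketched as below. The ranking key used here (fewest errors, then most correct results, then least CPU time, with non-competitive entries excluded) is an assumption inferred from the columns shown, not a restatement of the official 2015 scoring rules.

    # Sketch: recover the sequential-performance winner from the table above.
    # Assumed ranking key: fewest errors, then most correct answers, then
    # least CPU time; non-competitive (n) entries are assumed ineligible.
    rows = [
        # (solver, errors, corrects, cpu_seconds, competitive)
        ("CVC4",        0, 1009, 1381.95, True),
        ("CVC4 (exp)",  0, 1009,  658.94, True),
        ("SMTInterpol", 0, 1009, 1629.07, True),
        ("Yices",       0, 1009,   38.54, True),
        ("MathSat (n)", 0, 1009,  546.72, False),
        ("z3 (n)",      0, 1009,  121.79, False),
        ("veriT",       0,   15,    6.25, True),
    ]
    ranked = sorted((r for r in rows if r[4]),
                    key=lambda r: (r[1], -r[2], r[3]))
    print(ranked[0][0])  # Yices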

Sequential Performance (industrial)

Solver         Errors   Corrects   CPU time (s)
CVC4                0         19        1191.80
CVC4 (exp)          0         19         370.94
SMTInterpol         0         19         116.69
Yices               0         19           1.99
MathSat (n)         0         19         418.51
z3 (n)              0         19           1.81
veriT               0          7           0.71

Parallel Performance

Solver         Errors   Corrects   CPU time (s)   Wall time (s)
CVC4                0       1009        1381.95         1384.00
CVC4 (exp)          0       1009         658.94          623.77
SMTInterpol         0       1009        1629.07          920.74
Yices               0       1009          38.54           41.43
MathSat (n)         0       1009         546.72          547.64
z3 (n)              0       1009         121.79          122.15
veriT               0         15           6.25           10.60

Parallel Performance (industrial)

Solver         Errors   Corrects   CPU time (s)   Wall time (s)
CVC4                0         19        1191.80         1191.54
CVC4 (exp)          0         19         370.94          333.57
SMTInterpol         0         19         116.69           94.68
Yices               0         19           1.99            2.04
MathSat (n)         0         19         418.51          418.44
z3 (n)              0         19           1.81            1.82
veriT               0          7           0.71            0.76

Other Information

Solver         Not solved   Remaining
CVC4                    0           0
CVC4 (exp)              0           0
SMTInterpol             0           0
Yices                   0           0
MathSat (n)             0           0
z3 (n)                  0           0
veriT                 994           0

(n) Non-competitive.