SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2016

QF_UF (Main Track)

Competition results for the QF_UF division as of Thu Jul 7 07:24:34 GMT 2016.

Benchmarks in this division: 6649
Time Limit: 1200s

Winners

Sequential Performance   Parallel Performance
Yices2                   Yices2

Result table [1]

Sequential Performance

Solver         Error Score   Correct Score   avg. CPU time (s)
CVC4                 0.000        6520.257              75.075
MathSat5 [n]         0.000        6071.140             233.675
OpenSMT2             0.000        6484.768              86.172
SMTInterpol          0.000        6039.324             251.790
Yices2               0.000        6649.000               4.307
toysmt               0.000        4008.335             670.382
veriT-dev            0.000        6622.482              12.196
z3 [n]               0.000        6104.738             215.222

Parallel Performance

Solver         Error Score   Correct Score   avg. CPU time (s)   avg. wall time (s)   Unsolved
CVC4                 0.000        6520.257              75.103               75.044         10
MathSat5 [n]         0.000        6071.140             233.800              233.639         79
OpenSMT2             0.000        6484.768              86.206               86.147         13
SMTInterpol          0.000        6039.324             270.245              248.576         81
Yices2               0.000        6649.000               4.307                4.304          0
toysmt               0.000        4008.335             670.700              670.270       2533
veriT-dev            0.000        6622.482              12.202               12.194          2
z3 [n]               0.000        6104.738             215.338              215.205         76

[n] Non-competitive.

[1] Scores are computed according to Section 7 of the rules.
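
As a rough illustration of how a ranking can be read off the table above, the sketch below orders the sequential results by error score (ascending), then correct score (descending), then average CPU time (ascending). This sort order is an assumption based on the usual SMT-COMP convention; the authoritative scoring definition, including the weighting behind the fractional correct scores, is Section 7 of the rules and is not reproduced here.

    # Hypothetical ranking sketch for the sequential QF_UF results above.
    # The sort keys (fewer errors, then more correct answers, then lower
    # average CPU time) are an assumption; Section 7 of the SMT-COMP 2016
    # rules is the authoritative definition.

    # (solver, error score, correct score, avg. CPU time in seconds)
    sequential_results = [
        ("CVC4",        0.000, 6520.257,  75.075),
        ("MathSat5",    0.000, 6071.140, 233.675),  # non-competitive
        ("OpenSMT2",    0.000, 6484.768,  86.172),
        ("SMTInterpol", 0.000, 6039.324, 251.790),
        ("Yices2",      0.000, 6649.000,   4.307),
        ("toysmt",      0.000, 4008.335, 670.382),
        ("veriT-dev",   0.000, 6622.482,  12.196),
        ("z3",          0.000, 6104.738, 215.222),  # non-competitive
    ]

    ranked = sorted(
        sequential_results,
        key=lambda r: (r[1], -r[2], r[3]),  # errors asc, correct desc, time asc
    )

    for rank, (solver, errors, correct, cpu) in enumerate(ranked, start=1):
        print(f"{rank}. {solver:<12} correct={correct:9.3f} avg_cpu={cpu:8.3f}s")

Running this places Yices2 first in the sequential ordering, consistent with the winners table above.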