SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2016

UF (Main Track)

Competition results for the UF division as of Thu Jul 7 07:24:34 GMT.

Benchmarks in this division: 2839
Time Limit: 1200s

Winners

Sequential Performance: CVC4
Parallel Performance:   CVC4

Result tables [1]

Sequential Performance

Solver                     Error Score   Correct Score   avg. CPU time (s)
CVC4                       0.000         2796.487        118.350
vampire_smt_4.1            0.000         2743.824        46.659
vampire_smt_4.1_parallel   0.000         2750.240        89.336
veriT-dev                  0.000         1949.835        695.077
z3 [n]                     0.000         1848.990        624.518

Parallel Performance

Solver                     Error Score   Correct Score   avg. CPU time (s)   avg. WALL time (s)   Unsolved
CVC4                       0.000         2796.487        118.350             123.442              49
vampire_smt_4.1            0.000         2743.824        46.659              46.304               99
vampire_smt_4.1_parallel   0.000         2756.832        124.536             34.897               85
veriT-dev                  0.000         1949.835        696.434             695.071              939
z3 [n]                     0.000         1848.990        624.786             624.483              1030

[n] Non-competitive.

[1] Scores are computed according to Section 7 of the rules.
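
The following is a minimal, unofficial sketch in Python of how a ranking consistent with the tables above can be read off, assuming solvers are ordered by lowest error score, then highest correct score, then lowest average CPU time (Section 7 of the rules gives the authoritative definition). The Result type and rank_solvers function are illustrative names only, not part of any competition tooling.

    # Unofficial sketch: rank solvers from the sequential results above.
    # Assumed order: error score ascending, correct score descending,
    # average CPU time ascending (see Section 7 of the rules).
    from typing import NamedTuple

    class Result(NamedTuple):
        solver: str
        error_score: float
        correct_score: float
        avg_cpu_time: float

    sequential = [
        Result("CVC4",                     0.000, 2796.487, 118.350),
        Result("vampire_smt_4.1",          0.000, 2743.824,  46.659),
        Result("vampire_smt_4.1_parallel", 0.000, 2750.240,  89.336),
        Result("veriT-dev",                0.000, 1949.835, 695.077),
        Result("z3",                       0.000, 1848.990, 624.518),  # non-competitive
    ]

    def rank_solvers(results):
        # Lexicographic key: fewer errors, then more correct answers, then less CPU time.
        return sorted(results, key=lambda r: (r.error_score, -r.correct_score, r.avg_cpu_time))

    for place, r in enumerate(rank_solvers(sequential), start=1):
        print(f"{place}. {r.solver}  (correct: {r.correct_score}, cpu: {r.avg_cpu_time}s)")

Under this assumed ordering, CVC4 ranks first, which matches the winner listed above.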