SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2016

UFLIA (Main Track)

Competition results for the UFLIA division as of Thu Jul 7 07:24:34 GMT 2016.

Benchmarks in this division: 8404

Winners

Sequential Performance   Parallel Performance
CVC4                     CVC4

Result table [1]

Sequential Performance

Solver                     Error Score   Correct Score   avg. CPU time
CVC4                             0.000        8295.625          36.146
vampire_smt_4.1                  0.000        8183.886          40.262
vampire_smt_4.1_parallel         0.000        8170.189          76.653
veriT-dev                        0.000        8086.933          81.738
z3 [n]                           0.000        7808.738         107.667

Parallel Performance

Solver                     Error Score   Correct Score   avg. CPU time   avg. WALL time   Unsolved
CVC4                             0.000        8295.625          36.162           36.897         95
vampire_smt_4.1                  0.000        8183.886          40.262           40.013        291
vampire_smt_4.1_parallel         0.000        8170.717         107.372           30.647        326
veriT-dev                        0.000        8086.933          81.890           81.729        349
z3 [n]                           0.000        7808.738         107.710          107.770        724

[n] Non-competitive.

[1] Scores are computed according to Section 7 of the rules.
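
For readers who want to rebuild a row of these tables from raw per-benchmark job output, the sketch below shows one way to aggregate the columns (error score, correct score, average CPU/wall time, unsolved count). It is a minimal illustration that assumes a uniform weight of 1 per benchmark; the actual Section 7 scoring weights benchmarks by family and handles errors and unknown-status benchmarks more carefully. The `BenchmarkResult` record and its field names are hypothetical, not part of the competition infrastructure.

```python
from dataclasses import dataclass

@dataclass
class BenchmarkResult:
    """Hypothetical record for one solver run on one benchmark."""
    expected: str    # known status: "sat", "unsat", or "unknown"
    answer: str      # solver output: "sat", "unsat", or "unknown" (incl. timeout)
    cpu_time: float  # CPU seconds charged to the run
    wall_time: float # wall-clock seconds of the run

def aggregate(results):
    """Aggregate a solver's runs into one result-table row.

    Simplified sketch: every benchmark has weight 1, and a run counts
    as correct only when its definite answer matches the known status.
    Section 7's family-based weighting is deliberately omitted.
    """
    definite = ("sat", "unsat")
    error = sum(1 for r in results
                if r.answer in definite and r.expected in definite
                and r.answer != r.expected)
    correct = sum(1 for r in results
                  if r.answer in definite and r.answer == r.expected)
    unsolved = sum(1 for r in results if r.answer not in definite)
    n = len(results)  # assumed non-empty: one run per division benchmark
    return {
        "error_score": float(error),
        "correct_score": float(correct),
        "avg_cpu_time": sum(r.cpu_time for r in results) / n,
        "avg_wall_time": sum(r.wall_time for r in results) / n,
        "unsolved": unsolved,
    }
```

Run once per solver over its 8404 division results, `aggregate` yields the five columns of the parallel table; note that whether the averages are taken over all benchmarks or only solved ones is a reading of the table, not something this page states.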