SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2016

ALIA (Main Track)

Competition results for the ALIA division as of Thu 7 Jul 2016, 07:24:34 GMT.

Benchmarks in this division: 42
Time Limit: 1200s

Winners

Sequential Performance: CVC4
Parallel Performance: CVC4

Result table¹

Sequential Performance

Solver                     Error Score   Correct Score   avg. CPU time (s)
CVC4                             0.000          42.000               0.055
vampire_smt_4.1                  0.000          38.000             100.025
vampire_smt_4.1_parallel         0.000          38.000             232.315
veriT-dev                        0.000          27.000             857.150
z3ⁿ                              0.000          42.000               0.045

Parallel Performance

Solver                     Error Score   Correct Score   avg. CPU time (s)   avg. WALL time (s)   Unsolved
CVC4                             0.000          42.000               0.055                0.057          0
vampire_smt_4.1                  0.000          38.000             100.025               99.566          4
vampire_smt_4.1_parallel         0.000          39.000             359.011               91.960          3
veriT-dev                        0.000          27.000             857.611              857.158         15
z3ⁿ                              0.000          42.000               0.045                0.046          0

n. Non-competitive.

1. Scores are computed according to Section 7 of the rules.
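As a rough illustration of how summary rows like the ones above could be derived from raw per-benchmark results, here is a minimal Python sketch. The authoritative definitions live in Section 7 of the SMT-COMP 2016 rules; the field names (`status`, `expected`, `cpu`) and the simple "wrong answer counts toward the error score, solved benchmark counts toward the correct score" tally are illustrative assumptions, not the official implementation.

```python
# Hedged sketch of per-solver scoring, assuming per-benchmark records with:
#   'status'   -- the solver's answer: 'sat', 'unsat', or 'unknown'
#   'expected' -- the benchmark's known status, or 'unknown'
#   'cpu'      -- CPU seconds charged to the solver on this benchmark
# This is NOT the official Section 7 computation, only a plausible reading.

def summarize(results):
    """Return (error_score, correct_score, avg_cpu_time) for one solver."""
    error = correct = 0
    total_cpu = 0.0
    for r in results:
        total_cpu += r["cpu"]
        if r["status"] in ("sat", "unsat"):  # 'unknown' scores nothing
            if r["expected"] != "unknown" and r["status"] != r["expected"]:
                error += 1    # answer contradicts the known status
            else:
                correct += 1  # benchmark counted as solved
    return error, correct, total_cpu / len(results)

# Tiny usage example over three hypothetical benchmarks:
rows = [
    {"status": "sat",     "expected": "sat",   "cpu": 0.05},
    {"status": "unsat",   "expected": "unsat", "cpu": 0.06},
    {"status": "unknown", "expected": "sat",   "cpu": 1200.0},  # timeout
]
print(summarize(rows))
```

Averaging CPU time over all benchmarks, timeouts included, matches the pattern in the tables above, where solvers with unsolved benchmarks show large average times dominated by the 1200 s limit.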