SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2017

AUFLIA (Main Track)

Competition results for the AUFLIA division as of Fri 21 Jul 2017, 10:18:02 GMT.

Benchmarks in this division: 4
Time Limit: 1200s

Winners

Sequential Performance: CVC4
Parallel Performance:   CVC4

Result table [1]

Sequential Performance

Solver         Error Score   Correct Score   CPU Time (s)   Unsolved
CVC4           0.000         3.000            312.136       1
vampire 4.2    0.000         2.000            600.100       2
veriT          0.000         2.000            599.960       2
z3-4.5.0 [n]   0.000         2.000            600.016       2

Parallel Performance

Solver         Error Score   Correct Score   CPU Time (s)   Wall Time (s)   Unsolved
CVC4           0.000         3.000            312.136        315.086        1
vampire 4.2    0.000         3.000           1582.293        398.478        1
veriT          0.000         2.000            599.960        600.040        2
z3-4.5.0 [n]   0.000         2.000            600.023        600.053        2

[n] Non-competing.

[1] Scores are computed according to Section 7 of the rules.
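For readers who want to see how columns like these could be aggregated from raw per-benchmark results, the sketch below shows one plausible computation in Python. It is illustrative only: the BenchmarkRun fields, the 1200 s cap, and the way wrong answers, correct answers, and unsolved benchmarks are counted are assumptions for the sketch, not the official definitions, which are given in Section 7 of the rules.

```python
"""Illustrative score aggregation for one solver in a division.

Minimal sketch only: assumes the error score counts wrong answers, the
correct score counts benchmarks answered correctly within the time
limit, CPU time is summed over all attempts (capped at the limit), and
"unsolved" counts the rest. The authoritative definitions are in
Section 7 of the SMT-COMP 2017 rules; the field names are hypothetical.
"""

from dataclasses import dataclass

TIME_LIMIT = 1200.0  # seconds, per the division header above


@dataclass
class BenchmarkRun:
    expected: str    # benchmark status: "sat" or "unsat"
    reported: str    # solver answer: "sat", "unsat", or "unknown"
    cpu_time: float  # CPU seconds used by the solver


def aggregate(runs: list[BenchmarkRun]) -> dict[str, float]:
    """Aggregate per-benchmark runs into one result-table row."""
    error = correct = unsolved = 0
    cpu = 0.0
    for r in runs:
        answered = r.reported in ("sat", "unsat") and r.cpu_time <= TIME_LIMIT
        if answered and r.reported != r.expected:
            error += 1       # definite answer that contradicts the benchmark status
        elif answered:
            correct += 1     # correct answer within the time limit
        else:
            unsolved += 1    # "unknown" answer or timeout
        cpu += min(r.cpu_time, TIME_LIMIT)
    return {
        "error_score": error,
        "correct_score": correct,
        "cpu_time": round(cpu, 3),
        "unsolved": unsolved,
    }


if __name__ == "__main__":
    # Hypothetical runs over a 4-benchmark division (not the real AUFLIA data).
    runs = [
        BenchmarkRun("unsat", "unsat", 101.2),
        BenchmarkRun("sat", "sat", 5.9),
        BenchmarkRun("unsat", "unsat", 205.0),
        BenchmarkRun("sat", "unknown", 1200.0),  # gave up / timed out
    ]
    print(aggregate(runs))
```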