SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.


SMT-COMP 2018


AUFLIA (Main Track)

Competition results for the AUFLIA division as of Fri Jul 13, 2018, 00:02:11 GMT.

Benchmarks in this division: 4
Time limit: 1200s

Winners

Sequential Performance: CVC4
Parallel Performance: CVC4

Result table [1]

Sequential Performance

Solver        Error Score  Correctly Solved Score  CPU Time Score  Solved  Unsolved
Alt-Ergo      0.000        2.000                    600.091        2       2
CVC4          0.000        3.000                    298.846        3       1
Vampire 4.3   0.000        2.000                   1069.812        2       2
veriT         0.000        2.000                    599.887        2       2
z3-4.7.1 [n]  0.000        2.000                    599.992        2       2

Parallel Performance

Solver        Error Score  Correctly Solved Score  CPU Time Score  Wall Time Score  Solved  Unsolved
Alt-Ergo      0.000        2.000                   2384.714         600.116         2       2
CVC4          0.000        3.000                    298.846         300.130         3       1
Vampire 4.3   0.000        2.000                   2856.440         718.229         2       2
veriT         0.000        2.000                    599.887         600.024         2       2
z3-4.7.1 [n]  0.000        2.000                    599.992         600.052         2       2

[n] Non-competing.

[1] Scores are computed according to Section 7 of the rules.
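As a rough illustration of how a winner can be determined from a table like the one above, the sketch below ranks solvers lexicographically: fewest errors first, then most correctly solved benchmarks, then least CPU time, with non-competing entries excluded. This ordering is an assumption for illustration only; the authoritative scoring definition is Section 7 of the SMT-COMP 2018 rules.

```python
# Sequential-performance data from the AUFLIA result table:
# (solver, error score, correctly-solved score, CPU time score, non-competing?)
results = [
    ("Alt-Ergo",    0.000, 2.000,  600.091, False),
    ("CVC4",        0.000, 3.000,  298.846, False),
    ("Vampire 4.3", 0.000, 2.000, 1069.812, False),
    ("veriT",       0.000, 2.000,  599.887, False),
    ("z3-4.7.1",    0.000, 2.000,  599.992, True),   # non-competing
]

def rank(entries):
    # Non-competing solvers appear in the table but cannot win.
    competing = [e for e in entries if not e[4]]
    # Fewer errors first, more correct answers next, then less CPU time.
    return sorted(competing, key=lambda e: (e[1], -e[2], e[3]))

print(rank(results)[0][0])  # CVC4 tops the sequential ranking
```

Under this assumed ordering, CVC4 comes first because it correctly solved 3 of the 4 benchmarks while every other competing solver solved 2, which matches the winner shown above.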