SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2018

ALIA (Main Track)

Competition results for the ALIA division as of Fri Jul 13 00:02:11 GMT 2018.

Benchmarks in this division: 42
Time limit: 1200s

Winners

Sequential Performance: CVC4
Parallel Performance: CVC4

Result table [1]

Sequential Performance

Solver         Error Score   Correctly Solved Score   CPU Time Score   Solved   Unsolved
Alt-Ergo       0.000         39.000                   28.764           39       3
CVC4           0.000         40.000                   0.677            40       2
Vampire 4.3    0.000         27.000                   495.046          27       15
veriT          0.000         27.000                   428.554          27       15
z3-4.7.1 [n]   0.000         42.000                   0.056            42       0

Parallel Performance

Solver         Error Score   Correctly Solved Score   CPU Time Score   Wall Time Score   Solved   Unsolved
Alt-Ergo       0.000         39.000                   28.821           28.672            39       3
CVC4           0.000         40.000                   0.677            0.677             40       2
Vampire 4.3    0.000         35.000                   1473.815         370.630           35       7
veriT          0.000         27.000                   428.697          428.599           27       15
z3-4.7.1 [n]   0.000         42.000                   0.056            0.056             42       0

[n] Non-competing.

[1] Scores are computed according to Section 7 of the rules.
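
To illustrate how the columns above relate to raw solver runs, the sketch below tallies per-solver aggregates (error score, correctly solved score, CPU time score, solved, unsolved) from a list of individual results. It is a minimal, hypothetical reconstruction assuming an unweighted scheme: one point per correct answer, CPU time summed over solved runs only. It is not the official Section 7 scoring, which defines benchmark weighting and error handling precisely; the names RunResult and tally are illustrative only.

    from collections import defaultdict
    from dataclasses import dataclass

    @dataclass
    class RunResult:
        solver: str       # solver name, e.g. "CVC4"
        answer: str       # "sat", "unsat", or "unknown" (incl. timeout)
        expected: str     # known status of the benchmark
        cpu_time: float   # CPU seconds used by this run

    def tally(results, num_benchmarks):
        # Aggregate raw runs into rows resembling the tables above.
        # Unweighted sketch only; not the Section 7 definition.
        rows = defaultdict(lambda: {"error": 0.0, "correct": 0.0,
                                    "cpu": 0.0, "solved": 0})
        for r in results:
            row = rows[r.solver]
            if r.answer in ("sat", "unsat"):
                if r.answer == r.expected:
                    row["correct"] += 1.0      # one point per correct answer
                    row["cpu"] += r.cpu_time   # time counted on solved runs only
                    row["solved"] += 1
                else:
                    row["error"] += 1.0        # wrong answer raises the error score
            # an "unknown" result contributes nothing; it appears as unsolved below
        for row in rows.values():
            row["unsolved"] = num_benchmarks - row["solved"]
        return dict(rows)

For the ALIA tables, num_benchmarks would be 42; a solver answering 40 benchmarks correctly would then show 40 solved and 2 unsolved.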