SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2018

UF (Main Track)

Competition results for the UF division as of Fri Jul 13 00:02:11 GMT 2018.

Benchmarks in this division: 7562
Time limit: 1200s

Winners

Sequential Performance: CVC4
Parallel Performance: Vampire 4.3

Result table [1]

Sequential Performance

Solver         Error Score   Correctly Solved Score   CPU Time Score   Solved   Unsolved
CVC4                 0.000                 4348.029          545.478     4179       3383
Vampire 4.3          0.000                 4090.284          574.918     3932       3630
veriT                0.000                 3635.886          622.447     3231       4331
z3-4.7.1 [n]         0.000                 3171.495          597.518     2861       4701

Parallel Performance

Solver         Error Score   Correctly Solved Score   CPU Time Score   Wall Time Score   Solved   Unsolved
CVC4                 0.000                 4348.029          546.027           551.917     4179       3383
Vampire 4.3          0.000                 4366.198         2127.260           535.008     4207       3355
veriT                0.000                 3635.886          623.287           622.522     3231       4331
z3-4.7.1 [n]         0.000                 3171.495          597.520           597.613     2861       4701

[n] Non-competing.

[1] Scores are computed according to Section 7 of the rules.
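
For readers who want to reproduce the ranking from these tables, the sketch below is a minimal illustration, not the official scoring procedure (which is defined in Section 7 of the rules). It assumes the ordering suggested by the columns above: lower error score first, then higher correctly solved score, then lower time score, with non-competing entries excluded. The figures are copied from the parallel-performance table.

    # Minimal ranking sketch. The lexicographic ordering below is an
    # assumption based on the result tables; see Section 7 of the rules
    # for the authoritative definition.
    from typing import List, NamedTuple

    class Entry(NamedTuple):
        solver: str
        error_score: float             # lower is better
        correctly_solved_score: float  # higher is better
        time_score: float              # CPU time (sequential) or wall time (parallel); lower is better
        competing: bool = True

    def rank(entries: List[Entry]) -> List[Entry]:
        """Rank competing solvers: fewer errors, more correct answers, less time."""
        return sorted(
            (e for e in entries if e.competing),
            key=lambda e: (e.error_score, -e.correctly_solved_score, e.time_score),
        )

    # Parallel-performance figures from the table above (wall-time score used).
    parallel = [
        Entry("CVC4",        0.000, 4348.029, 551.917),
        Entry("Vampire 4.3", 0.000, 4366.198, 535.008),
        Entry("veriT",       0.000, 3635.886, 622.522),
        Entry("z3-4.7.1",    0.000, 3171.495, 597.613, competing=False),
    ]

    print(rank(parallel)[0].solver)  # prints: Vampire 4.3

Under this ordering, Vampire 4.3 leads the parallel table on correctly solved score while CVC4 leads the sequential table, which is consistent with the winners listed above.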