SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.


SMT-COMP 2018


UFLIA (Main Track)

Competition results for the UFLIA division as of Fri Jul 13 00:02:11 GMT.

Benchmarks in this division: 10137
Time limit: 1200 s

Winners

Sequential Performance: CVC4
Parallel Performance:   CVC4

Result table [1]

Sequential Performance

Solver        Error Score   Correctly Solved Score   CPU-time Score   Solved   Unsolved
CVC4          0.000         8254.323                 218.360          7677     2460
Vampire 4.3   0.000         7734.412                 302.722          6538     3599
veriT         0.000         8059.022                 239.786          7478     2659
z3-4.7.1n     0.000         8021.726                 224.676          7255     2882

Parallel Performance

Solver        Error Score   Correctly Solved Score   CPU-time Score   Wall-time Score   Solved   Unsolved
CVC4          0.000         8254.323                 218.530          221.354           7677     2460
Vampire 4.3   0.000         8113.039                 1064.191         267.675           7255     2882
veriT         0.000         8059.022                 240.140          239.800           7478     2659
z3-4.7.1n     0.000         8021.726                 224.676          224.898           7255     2882

n. Non-competing.

[1] Scores are computed according to Section 7 of the rules.
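The solved and unsolved counts reported above can be cross-checked against the division's benchmark total. A minimal Python sketch, using the figures from the sequential-performance table (this consistency check is illustrative only and is not part of the official Section 7 scoring):

```python
# Rows copied from the UFLIA sequential-performance table:
# (solver, error score, correctly-solved score, CPU-time score, solved, unsolved)
ROWS = [
    ("CVC4",        0.000, 8254.323, 218.360, 7677, 2460),
    ("Vampire 4.3", 0.000, 7734.412, 302.722, 6538, 3599),
    ("veriT",       0.000, 8059.022, 239.786, 7478, 2659),
    ("z3-4.7.1",    0.000, 8021.726, 224.676, 7255, 2882),
]
BENCHMARKS = 10137  # benchmarks in the UFLIA division

for name, _err, _score, _cpu, solved, unsolved in ROWS:
    # Every benchmark is either solved or unsolved, so the two
    # counts must sum to the division total for each solver.
    assert solved + unsolved == BENCHMARKS, name
    print(f"{name:12} solved {solved}/{BENCHMARKS} ({solved / BENCHMARKS:.1%})")
```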