SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.


SMT-COMP 2017


UFLIA (Main Track)

Competition results for the UFLIA division as of Fri Jul 21 10:18:02 GMT

Benchmarks in this division: 10136
Time Limit: 1200s

Winners

Sequential Performance   Parallel Performance
CVC4                     CVC4

Result table¹

Sequential Performance

Solver        Error Score   Correct Score   CPU Time   Unsolved
CVC4                0.000        8305.221    221.055       2449
vampire 4.2         0.000        8198.361    238.579       2612
veriT               0.000        8059.961    241.214       2619
z3-4.5.0ⁿ           0.000        8006.637    224.192       2915

Parallel Performance

Solver        Error Score   Correct Score   CPU Score   WALL Score   Unsolved
CVC4                0.000        8305.221     221.381      223.930       2449
vampire 4.2         0.000        8219.309     753.620      192.031       2577
veriT               0.000        8060.189     241.485      241.261       2618
z3-4.5.0ⁿ           0.000        8006.637     224.193      224.434       2915

ⁿ Non-competing.

1. Scores are computed according to Section 7 of the rules.
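As a rough illustration of how a table like the one above is aggregated from raw solver runs, the sketch below tallies errors, correct answers, CPU time, and unsolved benchmarks per solver. This is a simplified stand-in, not the official Section 7 scoring; the `RunResult` fields and the tallying rules are assumptions for demonstration only.

```python
# Simplified, illustrative tallying of solver runs into a results table.
# NOT the official SMT-COMP Section 7 scoring; field names and rules here
# are assumptions for demonstration only.
from dataclasses import dataclass

TIME_LIMIT = 1200.0  # per-benchmark CPU limit in this division (seconds)

@dataclass
class RunResult:
    solver: str
    benchmark: str
    answer: str      # "sat", "unsat", or "unknown"
    expected: str    # known status of the benchmark
    cpu_time: float  # CPU seconds consumed by the run

def tally(runs):
    """Count errors, correct answers, CPU time spent on correct answers,
    and unsolved benchmarks per solver (simplified scoring)."""
    scores = {}
    for r in runs:
        s = scores.setdefault(
            r.solver, {"error": 0, "correct": 0, "cpu": 0.0, "unsolved": 0})
        if r.cpu_time > TIME_LIMIT or r.answer == "unknown":
            s["unsolved"] += 1   # timed out or gave up
        elif r.answer != r.expected:
            s["error"] += 1      # wrong sat/unsat answer
        else:
            s["correct"] += 1
            s["cpu"] += r.cpu_time
    return scores
```

The real competition additionally weights benchmarks (e.g. by family) and distinguishes sequential from parallel resource accounting, which this sketch omits.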