SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.


SMT-COMP 2017


QF_UF (Main Track)

Competition results for the QF_UF division as of Fri Jul 21 10:18:02 GMT

Benchmarks in this division: 6650
Time limit: 1200s

Winners

Sequential Performance: Yices2
Parallel Performance:   Yices2

Result table [1]

Sequential Performance

Solver          Error Score  Correct Score  CPU Time  Unsolved
CVC4                  0.000       6471.175    53.608        14
SMTInterpol           0.000       5995.422   146.976        85
Yices2                0.000       6650.000     4.593         0
mathsat-5.4.1n        0.000       6034.084   127.036        82
opensmt2              0.000       6451.909    55.229        16
veriT                 0.000       6623.937     7.575         2
z3-4.5.0n             0.000       6072.962   118.351        79
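The ordering of this table can be reproduced from the data alone. As a minimal sketch (assuming, per a common reading of Section 7 of the rules, that solvers rank by lowest error score, then highest correct score, then lowest CPU time; the exact tie-breaking is defined in the rules themselves):

```python
# Hypothetical ranking sketch over the sequential-performance table above.
# Assumed ordering: error score ascending, correct score descending,
# CPU time ascending -- see Section 7 of the rules for the definitive scheme.
results = [
    # (solver, error score, correct score, CPU time, unsolved)
    ("CVC4",           0.000, 6471.175,  53.608, 14),
    ("SMTInterpol",    0.000, 5995.422, 146.976, 85),
    ("Yices2",         0.000, 6650.000,   4.593,  0),
    ("mathsat-5.4.1n", 0.000, 6034.084, 127.036, 82),
    ("opensmt2",       0.000, 6451.909,  55.229, 16),
    ("veriT",          0.000, 6623.937,   7.575,  2),
    ("z3-4.5.0n",      0.000, 6072.962, 118.351, 79),
]

ranked = sorted(results, key=lambda r: (r[1], -r[2], r[3]))
for solver, err, correct, cpu, unsolved in ranked:
    print(f"{solver:15s} correct={correct:9.3f} cpu={cpu:8.3f}s unsolved={unsolved}")
```

With every error score at 0.000, the correct score alone decides the order here, placing Yices2 (6650.000, all benchmarks solved) first.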

Parallel Performance

Solver          Error Score  Correct Score  CPU Score  WALL Score  Unsolved
CVC4                  0.000       6471.175     53.608      53.616        14
SMTInterpol           0.000       5995.422    154.709     143.873        85
Yices2                0.000       6650.000      4.593       4.596         0
mathsat-5.4.1n        0.000       6034.084    127.037     127.044        82
opensmt2              0.000       6451.909     55.229      55.253        16
veriT                 0.000       6623.937      7.575       7.603         2
z3-4.5.0n             0.000       6072.962    118.352     118.486        79

n. Non-competing entry.

1. Scores are computed according to Section 7 of the rules.