SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.


SMT-COMP 2017


QF_UF (Unsat Core Track)

Competition results for the QF_UF division as of Tue Jul 18 22:06:21 GMT.

Benchmarks in this division: 4101
Time limit: 2400 s

Winners

Sequential Performance: CVC4
Parallel Performance: CVC4

Result table¹

Sequential Performance

Solver            Error Score   Reduction Score   avg. CPU time (s)
CVC4                    0.000        390954.117              47.119
SMTInterpol             0.000        378321.359             107.550
mathsat-5.4.1ⁿ          0.000        376540.075              81.124
z3-4.5.0ⁿ               0.000        385039.152              83.100

Parallel Performance

Solver            Error Score   Reduction Score   avg. CPU time (s)   avg. wall time (s)
CVC4                    0.000        390965.935             103.151              103.171
SMTInterpol             0.000        378371.359             697.130              297.279
mathsat-5.4.1ⁿ          0.000        376540.075             268.032              268.546
z3-4.5.0ⁿ               0.000        385039.152             260.603              261.629

n. Non-competing.

1. Scores are computed according to Section 7 of the rules.
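To illustrate how a winner is read off such a table, here is a minimal sketch in Python over the parallel-performance rows above. It assumes (as the winners listing suggests) that the winner is the competing solver with the highest reduction score; the exact ranking and tie-breaking criteria are defined in Section 7 of the rules, not here.

```python
# Parallel-performance rows from the table above:
# (solver, error score, reduction score, avg. CPU time, avg. wall time, competing)
# Solvers marked ⁿ in the table are non-competing.
rows = [
    ("CVC4",          0.000, 390965.935, 103.151, 103.171, True),
    ("SMTInterpol",   0.000, 378371.359, 697.130, 297.279, True),
    ("mathsat-5.4.1", 0.000, 376540.075, 268.032, 268.546, False),  # non-competing
    ("z3-4.5.0",      0.000, 385039.152, 260.603, 261.629, False),  # non-competing
]

# Among competing solvers, rank by reduction score (higher is better).
competing = [r for r in rows if r[5]]
winner = max(competing, key=lambda r: r[2])
print(winner[0])  # CVC4
```

Note that z3-4.5.0 scores higher than SMTInterpol but is excluded from the ranking because it ran hors concours.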