SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.


SMT-COMP 2016


QF_UF (Unsat Core Track)

Competition results for the QF_UF division as of Wed Jun 29 20:25:52 GMT.

Benchmarks in this division: 4100

Non-Competitive division

Result table [1]

Sequential Performance

Solver        Error Score  Reduction Score  avg. CPU time (s)  avg. wall time (s)
MathSat5 [n]        0.000       395226.387            241.556             241.401
SMTInterpol         0.000       383391.949            297.799             284.692
toysmt [n]          0.000       124086.948            794.605             794.344
veriT [n]         100.019            0.000            311.513             311.474
z3 [n]              0.000       384861.688            257.374             257.295

[n] Non-competitive.

[1] Scores are computed according to Section 7 of the rules.
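As an illustration of how an unsat-core track might be scored, here is a minimal sketch. It is not the official SMT-COMP scoring code; it only assumes (as a labeled simplification) that each validated benchmark credits a solver with the number of assertions removed from the input (the "reduction"), and that each invalid or unverified core counts toward the error score. The exact definitions are in Section 7 of the rules.

```python
# Hypothetical sketch, NOT the official SMT-COMP scorer.
# Assumption: a benchmark result is (n_assertions, n_core, valid), where
# n_assertions is the number of top-level assertions in the input,
# n_core is the size of the reported unsat core, and valid says whether
# the core was independently confirmed to be unsatisfiable.

def score(results):
    """Return (error_score, reduction_score) for one solver's results."""
    error = 0
    reduction = 0
    for n_assertions, n_core, valid in results:
        if not valid:
            # An incorrect or unverified core is penalized, not credited.
            error += 1
        else:
            # Credit equals the number of assertions pruned away.
            reduction += n_assertions - n_core
    return error, reduction

print(score([(10, 3, True), (5, 5, True), (8, 2, False)]))  # (1, 7)
```

Under this toy scheme, solvers are ranked first by (lower) error score and then by (higher) reduction score, which matches the shape of the table above, where veriT's nonzero error score accompanies a zero reduction score.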