SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.


SMT-COMP 2018

QF_UF (Unsat Core Track)

Competition results for the QF_UF division as of Thu Jul 12 23:54:00 GMT.

Benchmarks in this division: 4330
Time limit: 2400s

Winners

Sequential Performance: CVC4
Parallel Performance:   CVC4

Result table [1]

Sequential Performance

Solver              Error Score  Reduction Score  avg. CPU time
SMTInterpol               0.000      1287820.550        280.047
Yices 2.6.0               0.000      1265953.002        218.564
CVC4                      0.000      1290154.409         95.397
mathsat-5.5.2 [n]         0.000      1214148.173        233.997
z3-4.7.1 [n]              0.000      1288198.458        239.699

Parallel Performance

Solver              Error Score  Reduction Score  avg. CPU time  avg. WALL time
SMTInterpol               0.000      1287820.550        654.588         267.643
Yices 2.6.0               0.000      1265953.002        218.571         218.580
CVC4                      0.000      1290154.409         95.397          95.347
mathsat-5.5.2 [n]         0.000      1214148.173        234.002         233.994
z3-4.7.1 [n]              0.000      1288198.458        239.703         239.655

[n] Non-competing.

[1] Scores are computed according to Section 7 of the rules.