SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2017

QF_LIA (Unsat Core Track)

Competition results for the QF_LIA division as of Tue Jul 18 22:06:21 GMT 2017.

Benchmarks in this division: 2844
Time Limit: 2400s

Winners

Sequential Performance    Parallel Performance
SMTInterpol               SMTInterpol

Result table¹

Sequential Performance

Solver            Error Score   Reduction Score   avg. CPU time (s)
CVC4                    0.000             0.000               0.117
SMTInterpol             0.000       1661748.088              44.808
mathsat-5.4.1ⁿ          0.000       1660217.028              55.112
z3-4.5.0ⁿ               0.000       1661127.381              78.282

Parallel Performance

Solver            Error Score   Reduction Score   avg. CPU time (s)   avg. WALL time (s)
CVC4                    0.000             0.000               0.117                0.120
SMTInterpol             0.000       1661748.088             199.559              122.495
mathsat-5.4.1ⁿ          0.000       1661007.166             185.351              186.239
z3-4.5.0ⁿ               0.000       1662034.097             219.712              221.194

ⁿ Non-competing.

¹ Scores are computed according to Section 7 of the rules.
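
For orientation only, the Python sketch below shows one plausible way scores of this shape could be aggregated; it is not the official Section 7 algorithm. It assumes (my assumptions, not quoted from the rules) that a validated unsat core earns a reduction equal to the number of top-level assertions it removes, that an invalid core counts as one error, and that the sequential and parallel rankings differ only in whether CPU or wall-clock time is considered. The `BenchmarkResult` record and `division_scores` function are hypothetical names introduced here.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class BenchmarkResult:
    """One solver's outcome on one unsat-core benchmark (hypothetical record)."""
    num_assertions: int       # top-level assertions in the benchmark
    core_size: Optional[int]  # size of the reported unsat core; None if none reported
    core_valid: bool          # True if validating solvers confirmed the core is unsat
    cpu_time: float           # CPU seconds used
    wall_time: float          # wall-clock seconds used

def division_scores(results: list[BenchmarkResult]) -> dict:
    """Aggregate per-benchmark outcomes into division-level scores,
    under the assumptions stated above (not the official algorithm)."""
    error_score = 0.0
    reduction_score = 0.0
    for r in results:
        if r.core_size is None:
            continue                    # no core reported: neither error nor reduction
        if r.core_valid:
            reduction_score += r.num_assertions - r.core_size
        else:
            error_score += 1.0          # an incorrect core is penalized
    n = len(results) or 1
    return {
        "error_score": error_score,
        "reduction_score": reduction_score,
        "avg_cpu_time": sum(r.cpu_time for r in results) / n,
        "avg_wall_time": sum(r.wall_time for r in results) / n,
    }
```

Under that reading, SMTInterpol's sequential reduction score of 1661748.088 over the 2844 benchmarks in this division would correspond to roughly 584 removed assertions per benchmark on average.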