SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2017

QF_LRA (Unsat Core Track)

Competition results for the QF_LRA division as of Tue Jul 18 22:06:21 GMT 2017.

Benchmarks in this division: 671
Time Limit: 2400s

Winners

Sequential Performance: SMTInterpol
Parallel Performance:   SMTInterpol

Result table¹

Sequential Performance

Solver            Error Score   Reduction Score   avg. CPU time (s)
CVC4                    0.000             0.000               0.047
SMTInterpol             0.000        133248.404              73.461
mathsat-5.4.1ⁿ          0.000         96826.816              76.722
z3-4.5.0ⁿ               0.000        112151.570              50.435

Parallel Performance

Solver            Error Score   Reduction Score   avg. CPU time (s)   avg. wall time (s)
CVC4                    0.000             0.000               0.047                0.049
SMTInterpol             0.000        136931.730             235.774              150.261
mathsat-5.4.1ⁿ          0.000        108799.538             232.904              234.642
z3-4.5.0ⁿ               0.000        134018.215             128.677              128.584

n. Non-competing.

1. Scores are computed according to Section 7 of the rules.
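The footnote above defers to Section 7 of the competition rules for the exact scoring formulas. As a rough orientation only, the sketch below shows one plausible way an unsat-core division's scores could be aggregated: a solver earns reduction points for each validated core equal to the number of assertions it removed, while wrong answers accumulate an error score. Everything here (the BenchmarkResult record, its fields, and division_scores) is a hypothetical illustration under that assumption, not the official SMT-COMP algorithm.

```python
from dataclasses import dataclass
from typing import Iterable, Tuple

@dataclass
class BenchmarkResult:
    """Hypothetical per-benchmark record for one solver (illustration only)."""
    num_assertions: int   # top-level assertions in the original benchmark
    core_size: int        # assertions in the reported unsat core
    core_validated: bool  # whether validation confirmed the core is still unsat
    erroneous: bool       # whether the solver gave a wrong or invalid answer

def division_scores(results: Iterable[BenchmarkResult]) -> Tuple[int, int]:
    """Aggregate (error_score, reduction_score) over one division.

    Loose reading of the idea behind the tables above: each validated core
    earns the number of assertions it removed; errors are tallied separately.
    The authoritative definition is Section 7 of the SMT-COMP 2017 rules.
    """
    results = list(results)
    error_score = sum(1 for r in results if r.erroneous)
    reduction_score = sum(
        r.num_assertions - r.core_size
        for r in results
        if r.core_validated and not r.erroneous
    )
    return error_score, reduction_score

# Toy example: one validated core that removed 120 assertions, one unvalidated core.
results = [
    BenchmarkResult(num_assertions=200, core_size=80, core_validated=True, erroneous=False),
    BenchmarkResult(num_assertions=50, core_size=50, core_validated=False, erroneous=False),
]
print(division_scores(results))  # -> (0, 120)
```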