SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.


SMT-COMP 2018


QF_LIA (Unsat Core Track)

Competition results for the QF_LIA division as of Thu Jul 12 23:54:00 GMT 2018.

Benchmarks in this division: 3019
Time limit: 2400 s

Winners

Sequential Performance: SMTInterpol
Parallel Performance: SMTInterpol

Result table [1]

Sequential Performance

Solver              Error Score   Reduction Score   avg. CPU time (s)
SMTInterpol         0.000         1486458.412       119.281
Yices 2.6.0         0.000         1485793.891       180.938
CVC4                0.000         0.000             0.107
mathsat-5.5.2 [n]   0.000         1480678.261       279.416
z3-4.7.1 [n]        0.000         1453064.304       265.185

Parallel Performance

Solver              Error Score   Reduction Score   avg. CPU time (s)   avg. wall time (s)
SMTInterpol         0.000         1486458.412       156.355             112.551
Yices 2.6.0         0.000         1485793.891       180.939             180.944
CVC4                0.000         0.000             0.107               0.107
mathsat-5.5.2 [n]   0.000         1480678.261       279.417             279.420
z3-4.7.1 [n]        0.000         1453064.304       265.186             265.201

[n] Non-competing.

[1] Scores are computed according to Section 7 of the rules.
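For intuition about the Reduction Score column, the sketch below shows one plausible way such a score could be tallied. It is only an illustration: it assumes the score sums, per benchmark, the number of top-level assertions removed by a validated unsat core, and every name in it (Result, reduction_score, the field names) is hypothetical. The authoritative definition is Section 7 of the SMT-COMP 2018 rules.

```python
# Illustrative only: a hypothetical tally of an unsat-core reduction
# score, ASSUMING it sums the assertions removed by each validated core.
# The real definition is Section 7 of the SMT-COMP 2018 rules.
from dataclasses import dataclass

@dataclass
class Result:                 # hypothetical per-benchmark solver result
    num_assertions: int       # named top-level assertions in the benchmark
    core_size: int            # assertions in the solver's reported unsat core
    core_validated: bool      # core was re-checked to be unsatisfiable
    answered_unsat: bool      # solver gave the expected "unsat" answer

def reduction_score(results: list[Result]) -> float:
    """Sum, over all benchmarks, the assertions removed by a
    validated unsat core; rejected runs contribute nothing."""
    return float(sum(r.num_assertions - r.core_size
                     for r in results
                     if r.answered_unsat and r.core_validated))

# Example: a 10-assertion benchmark reduced to a validated 3-assertion
# core contributes 7 to the score.
assert reduction_score([Result(10, 3, True, True)]) == 7.0
```

Under such a scheme, larger reduction scores are better, which is consistent with the winner having the highest Reduction Score in both tables above.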