SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2016

QF_LIA (Unsat Core Track)

Competition results for the QF_LIA division as of Wed Jun 29 20:25:52 GMT 2016.

Benchmarks in this division: 2840

This is a non-competitive division.

Result table [1]

Sequential Performance

Solver         Error Score   Reduction Score   avg. CPU time (s)   avg. wall time (s)
MathSat5 [n]         0.000       1673480.940             169.576              169.531
SMTInterpol          0.000       1675726.324             107.228               98.965
veriT [n]            1.809             0.000              85.556               85.552
z3 [n]               0.000       1675765.470             201.242              201.044

[n] Non-competitive.

[1] Scores are computed according to Section 7 of the rules.
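
For concreteness, below is a minimal sketch of how the four sequential-performance columns above could be aggregated from per-benchmark results. The record fields (error, reduction, cpu_time, wall_time) and the helper function are hypothetical names introduced here for illustration only; the authoritative definition of the error and reduction scores is Section 7 of the competition rules.

from dataclasses import dataclass
from typing import Iterable

@dataclass
class BenchmarkResult:
    # All field names are hypothetical, chosen only to mirror the table columns.
    error: float      # 1.0 if the solver's result failed validation, else 0.0 (assumed)
    reduction: float  # assertions removed by the reported unsat core (assumed)
    cpu_time: float   # CPU seconds spent on this benchmark
    wall_time: float  # wall-clock seconds spent on this benchmark

def sequential_performance(results: Iterable[BenchmarkResult]) -> dict:
    """Aggregate per-benchmark records into the four table columns."""
    rs = list(results)
    if not rs:
        raise ValueError("no benchmark results to aggregate")
    return {
        "error_score": sum(r.error for r in rs),
        "reduction_score": sum(r.reduction for r in rs),
        "avg_cpu_time": sum(r.cpu_time for r in rs) / len(rs),
        "avg_wall_time": sum(r.wall_time for r in rs) / len(rs),
    }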