SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2017

QF_LIA (Main Track)

Competition results for the QF_LIA division as of Fri Jul 21 10:18:02 GMT

Benchmarks in this division: 6141
Time Limit: 1200s

Winners

Sequential Performance   Parallel Performance
CVC4                     CVC4

Result table¹

Sequential Performance

Solver            Error Score   Correct Score   CPU Time (s)   Unsolved
CVC4                    0.000        5630.511        117.012        366
SMTInterpol             0.000        5571.807        137.246        477
SMTRAT                  0.000        3232.997        585.656       3746
Yices2                  0.000        5588.173        125.937        415
mathsat-5.4.1ⁿ          0.000        5691.610        102.361        239
veriT                   0.000        2829.462        387.184       3831
z3-4.5.0ⁿ               0.000        5498.033        140.220        475

Parallel Performance

Solver            Error Score   Correct Score   CPU Time (s)   Wall Time (s)   Unsolved
CVC4                    0.000        5630.511        117.013         117.009        366
SMTInterpol             0.000        5575.117        148.759         130.635        473
SMTRAT                  0.000        3232.997        585.667         585.711       3746
Yices2                  0.000        5588.173        125.939         126.026        415
mathsat-5.4.1ⁿ          0.000        5691.610        102.361         102.400        239
veriT                   0.000        2829.462        387.188         387.337       3831
z3-4.5.0ⁿ               0.000        5498.033        140.222         140.235        475

ⁿ Non-competing.

¹ Scores are computed according to Section 7 of the rules.
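
For reference, below is a minimal sketch of how the sequential ranking can be reproduced from the table above. It assumes the Section 7 ordering is lexicographic, preferring a lower error score, then a higher correct score, then a lower CPU time, with non-competing solvers excluded; the exact scoring and tie-breaking rules are defined in the rules document. The Result class and field names are illustrative, not part of any SMT-COMP tooling.

    # A minimal sketch, assuming Section 7 orders solvers lexicographically:
    # lower error score first, then higher correct score, then lower CPU time.
    # Non-competing solvers (marked with a superscript n) are excluded.
    from dataclasses import dataclass

    @dataclass
    class Result:
        solver: str
        error_score: float
        correct_score: float
        cpu_time: float
        unsolved: int
        competing: bool = True

    # Sequential-performance data copied from the result table above.
    results = [
        Result("CVC4",          0.000, 5630.511, 117.012,  366),
        Result("SMTInterpol",   0.000, 5571.807, 137.246,  477),
        Result("SMTRAT",        0.000, 3232.997, 585.656, 3746),
        Result("Yices2",        0.000, 5588.173, 125.937,  415),
        Result("mathsat-5.4.1", 0.000, 5691.610, 102.361,  239, competing=False),
        Result("veriT",         0.000, 2829.462, 387.184, 3831),
        Result("z3-4.5.0",      0.000, 5498.033, 140.220,  475, competing=False),
    ]

    # Rank competing solvers: fewest errors, most correct answers, least CPU time.
    ranking = sorted(
        (r for r in results if r.competing),
        key=lambda r: (r.error_score, -r.correct_score, r.cpu_time),
    )

    for place, r in enumerate(ranking, start=1):
        print(f"{place}. {r.solver}: correct={r.correct_score}, cpu={r.cpu_time}s")

Under this assumed ordering, CVC4 ranks first among competing solvers, which matches the declared sequential-performance winner; mathsat-5.4.1 has a higher correct score but is non-competing.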