SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.


SMT-COMP 2017


QF_LRA (Main Track)

Competition results for the QF_LRA division as of Fri Jul 21 10:18:02 GMT 2017.

Benchmarks in this division: 1649
Time Limit: 1200s

Winners

Sequential Performance: CVC4
Parallel Performance:   CVC4

Result tables¹

Sequential Performance

Solver            Error Score   Correct Score   CPU Time   Unsolved
CVC4                    0.000        1594.975     66.330         72
SMTInterpol             0.000        1551.026     99.535        120
SMTRAT                  0.000        1313.411    262.421        659
Yices2                  0.000        1578.694     67.052         87
mathsat-5.4.1ⁿ          0.000        1520.481    112.649        191
opensmt2                1.438        1494.792    139.249        357
veriT                   0.000        1548.154     93.142        138
z3-4.5.0ⁿ               0.000        1498.182    130.651        262

Parallel Performance

Solver            Error Score   Correct Score   CPU Time   Wall Time   Unsolved
CVC4                    0.000        1594.975     66.330      66.516         72
SMTInterpol             0.000        1552.852    107.530      94.385        115
SMTRAT                  0.000        1313.411    262.425     262.451        659
Yices2                  0.000        1578.694     67.053      67.052         87
mathsat-5.4.1ⁿ          0.000        1520.481    112.650     112.709        191
opensmt2                1.438        1494.792    139.249     139.368        357
veriT                   0.000        1548.154     93.142      93.165        138
z3-4.5.0ⁿ               0.000        1498.182    130.653     130.678        262

ⁿ Non-competing.

¹ Scores are computed according to Section 7 of the rules.
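
To illustrate how the columns above relate, the sketch below aggregates per-benchmark results for one solver into the summary quantities reported here. It is a simplified, unweighted illustration only: the official computation in Section 7 of the rules weights benchmarks (so it will not reproduce the exact figures above), and the Result record and field names are assumptions, not part of the competition infrastructure.

    # Simplified sketch (NOT the official SMT-COMP scoring): aggregate one
    # solver's per-benchmark results into the summary columns shown above.
    # The record format is hypothetical; Section 7 of the rules defines the
    # actual, benchmark-weighted computation.
    from dataclasses import dataclass
    from typing import Iterable, Tuple

    @dataclass
    class Result:
        solved: bool       # solver answered within the 1200 s time limit
        correct: bool      # the answer matched the expected status
        cpu_time: float    # CPU seconds consumed
        wall_time: float   # wall-clock seconds consumed

    def summarize(results: Iterable[Result]) -> Tuple[float, float, float, float, int]:
        """Return (error score, correct score, CPU time, wall time, unsolved)."""
        rs = list(results)
        error    = float(sum(1 for r in rs if r.solved and not r.correct))
        correct  = float(sum(1 for r in rs if r.solved and r.correct))
        cpu      = sum(r.cpu_time for r in rs if r.solved and r.correct)
        wall     = sum(r.wall_time for r in rs if r.solved and r.correct)
        unsolved = sum(1 for r in rs if not r.solved)
        return error, correct, cpu, wall, unsolved

Broadly, rankings compare Error Score first (lower is better), then Correct Score (higher is better), with the time scores as tie-breakers; see Section 7 of the rules for the precise definitions.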