SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.


SMT-COMP 2017


QF_UFLRA (Main Track)

Competition results for the QF_UFLRA division as of Fri Jul 21 10:18:02 GMT

Benchmarks in this division: 1284
Time Limit: 1200s

Winners

Sequential Performance: Yices2
Parallel Performance:   Yices2

Result table 1

Sequential Performance

Solver           Error Score  Correct Score  CPU Time  Unsolved
CVC4                   0.000       1193.635    85.516         8
SMTInterpol            0.000       1193.635    87.667         8
Yices2                 0.000       1216.226    65.199         6
mathsat-5.4.1n         0.000       1193.635    84.635         8
veriT                  0.000       1193.635    84.646         8
z3-4.5.0n              0.000       1216.226    66.684         6

Parallel Performance

Solver           Error Score  Correct Score  CPU Score  WALL Score  Unsolved
CVC4                   0.000       1193.635     85.517      85.534         8
SMTInterpol            0.000       1193.635     89.680      86.132         8
Yices2                 0.000       1216.226     65.199      65.209         6
mathsat-5.4.1n         0.000       1193.635     84.635      84.651         8
veriT                  0.000       1193.635     84.647      84.655         8
z3-4.5.0n              0.000       1216.226     66.684      66.699         6

n. Non-competing.

1. Scores are computed according to Section 7 of the rules.
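To illustrate the shape of the ranking, here is a minimal Python sketch of how a main-track score could be tallied from per-benchmark results. This is a simplified illustration, not the actual Section 7 scoring: the real rules apply benchmark weighting (which is why the correct scores in the table are non-integer), while this sketch uses plain counts. All names and the data layout are assumptions made for the example.

```python
# Simplified sketch of main-track scoring. Assumption: each benchmark
# yields a solver answer, the benchmark's known status, and CPU time.
# The real SMT-COMP rules weight benchmarks; here every benchmark
# counts as 1 for simplicity.
from dataclasses import dataclass

@dataclass
class Result:
    answer: str      # "sat", "unsat", or "unknown"
    expected: str    # benchmark's known status
    cpu_time: float  # seconds used

def score(results, time_limit=1200.0):
    """Tally error/correct counts, total CPU time, and unsolved benchmarks."""
    error = correct = unsolved = 0
    cpu = 0.0
    for r in results:
        if r.answer == "unknown" or r.cpu_time > time_limit:
            unsolved += 1                 # no answer within the time limit
        elif r.answer == r.expected:
            correct += 1                  # correct answer
            cpu += r.cpu_time
        else:
            error += 1                    # wrong answer
    return {"error": error, "correct": correct, "cpu": cpu, "unsolved": unsolved}

def rank_key(s):
    """Rank by lower error score, then higher correct score, then lower CPU time."""
    return (s["error"], -s["correct"], s["cpu"])
```

Sorting solvers by `rank_key` reproduces the column order of the tables above: error score dominates, correct score breaks ties, and CPU time decides between solvers with equal scores.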