SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2017

LRA (Main Track)

Competition results for the LRA division as of Fri Jul 21 10:18:02 GMT 2017.

Benchmarks in this division: 2419
Time Limit: 1200s

Winners

Sequential Performance: CVC4
Parallel Performance:   CVC4

Result table¹

Sequential Performance

Solver          Error Score   Correct Score   CPU Time   Unsolved
CVC4                  0.000        2120.921    182.402        371
Redlog                0.000        1766.425    225.549        673
vampire 4.2           0.000        1072.149    681.678       1384
veriT+Redlog          0.000        1991.606    221.676        515
z3-4.5.0ⁿ             0.000        2248.185    110.671        211
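As a minimal illustration (not the official SMT-COMP scoring tool), the sequential winner can be read off the table above by taking the competing solver with the highest correct score, breaking ties by lower CPU time; non-competing entries (marked "n") are excluded. Note that z3-4.5.0 has the highest correct score but does not compete, so CVC4 wins:

```python
# Sequential Performance results for the LRA division, copied from the table above.
# Each row: (solver, error_score, correct_score, cpu_time, unsolved, competing)
results = [
    ("CVC4",         0.000, 2120.921, 182.402,  371, True),
    ("Redlog",       0.000, 1766.425, 225.549,  673, True),
    ("vampire 4.2",  0.000, 1072.149, 681.678, 1384, True),
    ("veriT+Redlog", 0.000, 1991.606, 221.676,  515, True),
    ("z3-4.5.0",     0.000, 2248.185, 110.671,  211, False),  # non-competing
]

def winner(rows):
    """Return the competing solver with the highest correct score,
    breaking ties by lower CPU time (a simplification of the ranking
    in Section 7 of the rules)."""
    competing = [r for r in rows if r[5]]
    return max(competing, key=lambda r: (r[2], -r[3]))[0]

print(winner(results))  # CVC4
```

The same selection applied to the parallel table (using wall-clock time for tie-breaking) yields the same winner.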

Parallel Performance

Solver          Error Score   Correct Score   CPU Score   WALL Score   Unsolved
CVC4                  0.000        2120.921     182.501      182.502        371
Redlog                0.000        1766.425     225.551      225.570        673
vampire 4.2           0.000        1120.945    2176.048      552.336       1327
veriT+Redlog          0.000        1991.606     221.678      221.730        515
z3-4.5.0ⁿ             0.000        2248.185     110.672      110.724        211

n. Non-competing.

1. Scores are computed according to Section 7 of the rules.