SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2017

QF_UFLIA (Main Track)

Competition results for the QF_UFLIA division, as of Fri Jul 21 10:18:02 GMT.

Benchmarks in this division: 583
Time Limit: 1200s

Winners

Sequential Performance: Yices2
Parallel Performance:   Yices2

Result tables [1]

Sequential Performance

Solver              Error Score   Correct Score   CPU Time (s)   Unsolved
CVC4                      0.000         583.000          1.291          0
SMTInterpol               0.000         583.000          4.004          0
Yices2                    0.000         583.000          0.152          0
mathsat-5.4.1 [n]         0.000         580.957         10.563          2
veriT                     0.000         522.318        112.109         72
z3-4.5.0 [n]              0.000         583.000          0.175          0

Parallel Performance

Solver              Error Score   Correct Score   CPU Score   WALL Score   Unsolved
CVC4                      0.000         583.000       1.291        1.294          0
SMTInterpol               0.000         583.000       4.004        2.405          0
Yices2                    0.000         583.000       0.152        0.157          0
mathsat-5.4.1 [n]         0.000         580.957      10.563       10.576          2
veriT                     0.000         522.318     112.111      112.130         72
z3-4.5.0 [n]              0.000         583.000       0.175        0.178          0

[n] Non-competing.

[1] Scores are computed according to Section 7 of the competition rules.
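The ranking implied by the tables above can be sketched in a few lines. The snippet below is a simplification, not the full Section 7 procedure: it assumes a lexicographic order (fewer errors first, then higher correct score, then lower CPU time) and that non-competing solvers are listed but excluded from the ranking, both of which match how the winner is reported here.

```python
# Hypothetical ranking sketch for the sequential-performance table.
# Assumed order: (error score asc, correct score desc, CPU time asc);
# this is a simplification of the full Section 7 scoring rules.

# (solver, error score, correct score, CPU time, non-competing?)
results = [
    ("CVC4",          0.000, 583.000,   1.291, False),
    ("SMTInterpol",   0.000, 583.000,   4.004, False),
    ("Yices2",        0.000, 583.000,   0.152, False),
    ("mathsat-5.4.1", 0.000, 580.957,  10.563, True),
    ("veriT",         0.000, 522.318, 112.109, False),
    ("z3-4.5.0",      0.000, 583.000,   0.175, True),
]

def rank(rows):
    # Non-competing solvers appear in the table but do not compete.
    competing = [r for r in rows if not r[4]]
    return sorted(competing, key=lambda r: (r[1], -r[2], r[3]))

print(rank(results)[0][0])  # -> Yices2, matching the division winner
```

Note that z3-4.5.0, despite a tied correct score and a lower CPU time than CVC4, is excluded as non-competing, so Yices2 takes the division on CPU time.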