SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.


SMT-COMP 2017


QF_IDL (Main Track)

Competition results for the QF_IDL division as of Fri Jul 21 10:18:02 GMT

Benchmarks in this division: 2193
Time Limit: 1200s

Winners

Sequential Performance: Yices2
Parallel Performance: Yices2

Result table [1]

Sequential Performance

Solver        Error Score  Correct Score  CPU Time  Unsolved
CVC4                0.000       1847.673   251.028       379
SMTInterpol         0.000       1615.537   364.940       594
Yices2              0.000       1918.061   173.183       257
veriT               0.000       1675.739   336.234       608
z3-4.5.0 [n]        0.000       1902.892   198.312       256

Parallel Performance

Solver        Error Score  Correct Score  CPU Score  WALL Score  Unsolved
CVC4                0.000       1847.673    251.029     251.181       379
SMTInterpol         0.000       1617.935    395.322     358.225       592
Yices2              0.000       1918.061    173.185     173.212       257
veriT               0.000       1675.739    336.238     336.243       608
z3-4.5.0 [n]        0.000       1902.892    198.313     198.340       256

n. Non-competing.

1. Scores are computed according to Section 7 of the rules.
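To illustrate how columns like these can be aggregated from raw solver runs, here is a minimal sketch in Python. It does NOT reproduce the official Section 7 formula (which weights benchmarks; see the rules): it assumes a simplified scoring where each correct answer adds 1 to the correct score, each wrong answer adds 1 to the error score, and unanswered runs count as unsolved. The `RunResult` type and `summarize` function are hypothetical names introduced for this example.

```python
from dataclasses import dataclass

@dataclass
class RunResult:
    """Hypothetical per-benchmark outcome for one solver run."""
    answer: str        # "sat", "unsat", or "unknown" (includes timeouts)
    expected: str      # known status of the benchmark
    cpu_seconds: float # CPU time consumed by the run

def summarize(runs, time_limit=1200.0):
    """Aggregate a solver's runs into table-style columns under the
    simplified scoring described above (not the official Section 7 one)."""
    answered = [r for r in runs if r.answer in ("sat", "unsat")]
    error = sum(1 for r in answered if r.answer != r.expected)
    correct = sum(1 for r in answered if r.answer == r.expected)
    unsolved = len(runs) - len(answered)
    # Average CPU time over all attempted benchmarks, capped at the limit.
    cpu_time = sum(min(r.cpu_seconds, time_limit) for r in runs) / len(runs)
    return {"error": error, "correct": correct,
            "cpu_time": cpu_time, "unsolved": unsolved}
```

For example, a solver that answers one benchmark correctly, one incorrectly, and times out on a third would get a correct score of 1, an error score of 1, and one unsolved benchmark under this simplified scheme.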