SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2017

QF_BV (Main Track)

Competition results for the QF_BV division as of Fri Jul 21 10:18:02 GMT.

Benchmarks in this division: 40043
Time limit: 1200 seconds

Winners

Sequential Performance: Boolector+CaDiCaL
Parallel Performance: MinkeyRink

Result table [1]

Sequential Performance

| Solver | Error Score | Correct Score | CPU Time | Unsolved |
|---|---|---|---|---|
| Boolector | 0.000 | 35203.685 | 199.654 | 961 |
| Boolector+CaDiCaL | 0.000 | 35208.875 | 191.938 | 672 |
| CVC4 | 0.000 | 33273.501 | 263.500 | 1578 |
| MinkeyRink | 0.000 | 34985.527 | 202.494 | 663 |
| Q3B | 0.000 | 15657.068 | 662.163 | 12207 |
| Yices2 | 0.000 | 32565.678 | 269.683 | 1059 |
| mathsat-5.4.1 [n] | 74.589 | 32026.942 | 280.398 | 1515 |
| stp_mt | 0.000 | 33632.222 | 249.733 | 1785 |
| stp_st | 0.000 | 34546.157 | 214.694 | 1429 |
| z3-4.5.0 [n] | 0.000 | 33377.675 | 239.156 | 2764 |

Parallel Performance

| Solver | Error Score | Correct Score | CPU Score | WALL Score | Unsolved |
|---|---|---|---|---|---|
| Boolector | 0.000 | 35203.685 | 199.655 | 200.128 | 961 |
| Boolector+CaDiCaL | 0.000 | 35208.875 | 191.939 | 192.252 | 672 |
| CVC4 | 0.000 | 34121.406 | 442.690 | 223.132 | 1334 |
| MinkeyRink | 0.000 | 35618.540 | 343.860 | 175.126 | 558 |
| Q3B | 0.000 | 15858.232 | 1860.536 | 645.642 | 11994 |
| Yices2 | 0.000 | 32565.678 | 269.686 | 270.067 | 1059 |
| mathsat-5.4.1 [n] | 74.589 | 32026.942 | 280.398 | 280.987 | 1515 |
| stp_mt | 0.000 | 34980.761 | 675.628 | 194.640 | 1280 |
| stp_st | 0.000 | 34546.157 | 214.695 | 215.046 | 1429 |
| z3-4.5.0 [n] | 0.000 | 33377.675 | 239.158 | 239.451 | 2764 |

[n] Non-competing.

[1] Scores are computed according to Section 7 of the rules.
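For readers who want to re-derive the ordering from the numbers above, the following Python sketch ranks the sequential results by correct score, with CPU time used as an assumed tie-breaker. This is only an illustration of how the table can be read; the authoritative scoring and ranking procedure is the one defined in Section 7 of the rules.

```python
# A minimal sketch (not part of the official SMT-COMP tooling): rank the
# sequential QF_BV results by correct score, breaking ties on CPU time.
# The tie-breaking rule is an assumption made for illustration only.

# (solver, error score, correct score, CPU time, unsolved)
sequential = [
    ("Boolector",          0.000, 35203.685, 199.654,   961),
    ("Boolector+CaDiCaL",  0.000, 35208.875, 191.938,   672),
    ("CVC4",               0.000, 33273.501, 263.500,  1578),
    ("MinkeyRink",         0.000, 34985.527, 202.494,   663),
    ("Q3B",                0.000, 15657.068, 662.163, 12207),
    ("Yices2",             0.000, 32565.678, 269.683,  1059),
    ("mathsat-5.4.1",     74.589, 32026.942, 280.398,  1515),  # non-competing
    ("stp_mt",             0.000, 33632.222, 249.733,  1785),
    ("stp_st",             0.000, 34546.157, 214.694,  1429),
    ("z3-4.5.0",           0.000, 33377.675, 239.156,  2764),  # non-competing
]

# Higher correct score first; lower CPU time breaks ties (assumption).
ranked = sorted(sequential, key=lambda row: (-row[2], row[3]))

for rank, (solver, err, correct, cpu, unsolved) in enumerate(ranked, start=1):
    print(f"{rank:2d}. {solver:<18} correct={correct:10.3f} cpu={cpu:8.3f}")
```

Running the sketch reproduces the ordering visible in the sequential table, with Boolector+CaDiCaL first, consistent with the winner listed above.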