SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2018

QF_BV (Unsat Core Track)

Competition results for the QF_BV division as of Thu Jul 12 23:54:00 GMT.

Benchmarks in this division: 25700
Time limit: 2400 s

Winners

Sequential Performance    Parallel Performance
Yices 2.6.0               Yices 2.6.0

Result table [1]

Sequential Performance

Solver             Error Score   Reduction Score   avg. CPU time
Yices 2.6.0        0.000         14706936.731      559.547
CVC4               0.000         14607999.737      428.874
mathsat-5.5.2 [n]  0.000         14681317.650      563.789
z3-4.7.1 [n]       0.000         14601324.711      641.502

Parallel Performance

Solver             Error Score   Reduction Score   avg. CPU time   avg. WALL time
Yices 2.6.0        0.000         14706936.731      559.550         559.600
CVC4               0.000         14607999.737      428.874         434.869
mathsat-5.5.2 [n]  0.000         14681317.650      563.790         563.833
z3-4.7.1 [n]       0.000         14601324.711      641.506         641.499

[n] Non-competing.

[1] Scores are computed according to Section 7 of the rules.
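The official scoring is defined in Section 7 of the rules; as an illustration only, the following sketch assumes a simplified unsat-core reduction metric in which a solver earns, per benchmark, the number of top-level assertions minus the size of the validated core it returns (and nothing for an invalid core). The function names and the benchmark data are hypothetical, not taken from the competition infrastructure.

```python
# Illustrative sketch, NOT the official SMT-COMP scoring code.
# Assumed metric: a validated unsat core scores (assertions - core size);
# an invalid core scores 0. See Section 7 of the rules for the real definition.

def reduction_score(num_assertions, core_size, core_valid):
    """Score for one benchmark: assertions eliminated by a validated core."""
    if not core_valid:
        return 0
    return num_assertions - core_size

def division_score(results):
    """Sum of per-benchmark reduction scores for one solver in a division."""
    return sum(reduction_score(n, c, ok) for n, c, ok in results)

# Hypothetical per-benchmark results: (assertions, core size, validated?)
results = [(120, 15, True), (80, 80, True), (50, 10, False)]
print(division_score(results))  # 105 + 0 + 0 = 105
```

Under this reading, a solver that returns small, correct cores on many benchmarks accumulates a large reduction score, which is why the Reduction Score column above dwarfs the per-benchmark time columns.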