SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2018

QF_IDL (Main Track)

Competition results for the QF_IDL division as of Fri Jul 13 00:02:11 GMT.

Benchmarks in this division: 2193
Time limit: 1200s
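
QF_IDL is the SMT-LIB logic of quantifier-free integer difference logic: Boolean combinations of bounds of the form x - y <= c over integer variables. As a minimal sketch of what an instance in this division looks like, the snippet below builds and solves a toy set of difference constraints with Z3's Python API (Z3 appears in the tables below; the constraints are illustrative, not drawn from a competition benchmark).

    # A tiny QF_IDL instance: difference constraints over integers.
    # Such a conjunction is satisfiable iff its constraint graph has no
    # negative cycle; here the cycle has weight 3 - 2 - 1 = 0, so it is sat.
    from z3 import Ints, Solver, sat

    x, y, z = Ints("x y z")

    s = Solver()
    s.add(x - y <= 3)
    s.add(y - z <= -2)
    s.add(z - x <= -1)

    assert s.check() == sat
    print(s.model())  # e.g. x = 3, y = 0, z = 2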

Winners

Sequential Performance: Yices 2.6.0
Parallel Performance:   Yices 2.6.0

Result table [1]

Sequential Performance

Solver                  | Error Score | Correctly Solved Score | CPU Time Score | Solved | Unsolved
CVC4                    |       0.000 |               1828.911 |        256.450 |   1802 |      391
CVC4-experimental-idl-2 |       3.363 |               1512.547 |        412.286 |   1633 |      560
MathSAT [n]             |       0.000 |               1592.398 |        365.980 |   1546 |      647
SMTInterpol             |       0.000 |               1602.734 |        369.163 |   1588 |      605
SMTRAT-Rat              |       2.570 |                963.674 |        672.758 |    946 |     1247
Yices 2.6.0             |       0.000 |               1919.806 |        167.532 |   1937 |      256
veriT                   |       0.000 |               1681.665 |        332.732 |   1590 |      603
z3-4.7.1 [n]            |       0.000 |               1889.377 |        202.692 |   1930 |      263

Parallel Performance

Solver                  | Error Score | Correctly Solved Score | CPU Time Score | Wall Time Score | Solved | Unsolved
CVC4                    |       0.000 |               1828.911 |        256.451 |         256.452 |   1802 |      391
CVC4-experimental-idl-2 |       3.363 |               1512.547 |        412.288 |         412.333 |   1633 |      560
MathSAT [n]             |       0.000 |               1592.398 |        365.982 |         365.971 |   1546 |      647
SMTInterpol             |       0.000 |               1603.153 |        403.392 |         362.860 |   1590 |      603
SMTRAT-Rat              |       2.570 |                963.674 |        672.764 |         673.135 |    946 |     1247
Yices 2.6.0             |       0.000 |               1919.806 |        167.533 |         167.495 |   1937 |      256
veriT                   |       0.000 |               1681.665 |        332.735 |         332.683 |   1590 |      603
z3-4.7.1 [n]            |       0.000 |               1889.377 |        202.694 |         202.701 |   1930 |      263

[n] Non-competing.

[1] Scores are computed according to Section 7 of the rules.
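
As a rough, unofficial illustration of what the score columns mean (the actual Section 7 definition additionally weights each benchmark by the size of its family, which is why the scores above are not integers), the hypothetical tally function below computes per-solver numbers from a list of results: an error is a sat/unsat answer that contradicts the benchmark's expected status, the correctly solved score counts answers that match it, and the CPU-time score accumulates time over correctly solved benchmarks.

    # Unofficial, simplified sketch of the scoring columns; the real
    # Section 7 rules also weight each benchmark by its family.
    def tally(results):
        """results: list of (answer, expected, cpu_seconds), where answer
        is 'sat', 'unsat', or 'unknown' (timeouts count as 'unknown')."""
        errors = solved = 0
        cpu_time = 0.0
        for answer, expected, cpu_seconds in results:
            if answer not in ("sat", "unsat"):
                continue                 # unknown: neither solved nor wrong
            if answer == expected:
                solved += 1              # contributes to the solved count
                cpu_time += cpu_seconds  # and to the CPU-time score
            else:
                errors += 1              # wrong answer: error score
        return {"error_score": errors,
                "correctly_solved_score": solved,
                "cpu_time_score": cpu_time,
                "solved": solved,
                "unsolved": len(results) - solved}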