SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2018

QF_UFIDL (Main Track)

Competition results for the QF_UFIDL division as of Fri Jul 13 00:02:11 GMT.

Benchmarks in this division: 428
Time limit: 1200s

Winners

Sequential Performance: Yices 2.6.0
Parallel Performance: Yices 2.6.0

Result table¹

Sequential Performance

Solver        Error Score   Correctly Solved Score   CPU Time Score   Solved   Unsolved
CVC4          0.000         397.069                  119.848          393      35
MathSATn      0.000         417.986                   51.752          408      20
SMTInterpol   0.000         419.155                   44.279          408      20
Yices 2.6.0   0.000         426.052                    7.525          426       2
veriT         0.000         404.878                   65.003          396      32
z3-4.7.1n     0.000         426.052                    8.937          426       2

Parallel Performance

Solver        Error Score   Correctly Solved Score   CPU Time Score   Wall Time Score   Solved   Unsolved
CVC4          0.000         397.069                  119.849          119.859           393      35
MathSATn      0.000         417.986                   51.752           51.744           408      20
SMTInterpol   0.000         419.155                   44.782           40.075           408      20
Yices 2.6.0   0.000         426.052                    7.525            7.526           426       2
veriT         0.000         404.878                   65.004           65.009           396      32
z3-4.7.1n     0.000         426.052                    8.937            8.938           426       2

n. Non-competing.

1. Scores are computed according to Section 7 of the rules.
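The official score definitions are given in Section 7 of the rules. As a rough illustration only, the sketch below re-derives the sequential winner from the result table above under an assumed ordering (higher correctly-solved score first, lower CPU-time score as tie-break, non-competing entries excluded); this is not the official computation.

```python
# Sequential results from the table above:
# (solver, error score, correctly solved score, CPU time score, competing?)
# The tie-break order below is an assumption for illustration, not a
# quote of the official Section 7 scoring rules.
results = [
    ("CVC4",        0.000, 397.069, 119.848, True),
    ("MathSAT",     0.000, 417.986,  51.752, False),  # non-competing
    ("SMTInterpol", 0.000, 419.155,  44.279, True),
    ("Yices 2.6.0", 0.000, 426.052,   7.525, True),
    ("veriT",       0.000, 404.878,  65.003, True),
    ("z3-4.7.1",    0.000, 426.052,   8.937, False),  # non-competing
]

# Keep only competing solvers, then rank by correctly-solved score
# (descending) with CPU-time score (ascending) as tie-break.
competing = [r for r in results if r[4]]
winner = max(competing, key=lambda r: (r[2], -r[3]))
print(winner[0])  # Yices 2.6.0
```

Note that z3-4.7.1 matches Yices 2.6.0 on the correctly-solved score (426.052) but is marked non-competing, so it does not affect the ranking.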