SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2018

QF_UF (Main Track)

Competition results for the QF_UF division as of Fri Jul 13 00:02:11 GMT.

Benchmarks in this division: 7423
Time limit: 1200 seconds

Winners

Sequential Performance    Parallel Performance
Yices 2.6.0               Yices 2.6.0

Result table [1]

Sequential Performance

Solver         Error Score   Correctly Solved Score   CPU Time Score   Solved   Unsolved
CVC4                 0.000                 7273.900           45.392     7411         12
MathSAT [n]          0.000                 6750.620          110.244     7282        141
SMTInterpol          0.000                 6785.016          131.860     7338         85
Yices 2.6.0          0.000                 7410.299            4.906     7422          1
opensmt2             2.436                 6512.865           46.959     6801        622
veriT                0.000                 7397.598            6.446     7421          2
z3-4.7.1 [n]         0.000                 6866.393          103.317     7344         79

Parallel Performance

Solver         Error Score   Correctly Solved Score   CPU Time Score   Wall Time Score   Solved   Unsolved
CVC4                 0.000                 7273.900           45.392            45.393     7411         12
MathSAT [n]          0.000                 6750.620          110.245           110.250     7282        141
SMTInterpol          0.000                 6785.016          137.393           129.308     7338         85
Yices 2.6.0          0.000                 7410.299            4.906             4.907     7422          1
opensmt2             2.436                 6512.865           46.959            46.982     6801        622
veriT                0.000                 7397.598            6.446             6.444     7421          2
z3-4.7.1 [n]         0.000                 6866.393          103.318           103.320     7344         79

[n] Non-competing.

[1] Scores are computed according to Section 7 of the rules.
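
For readers who want to reproduce the ranking from the aggregate numbers above, the sketch below orders solvers the way the result tables suggest: lower error score first, then higher correctly-solved score, then lower CPU-time score. This is a minimal illustration assuming that lexicographic order; the authoritative definition, including the per-benchmark weighting behind the aggregate scores, is Section 7 of the rules and is not reproduced here.

    # Minimal sketch: rank solvers from the aggregate scores in the
    # sequential table above. Assumes a lexicographic order (fewer
    # errors, then more correctly solved, then less CPU time); the
    # authoritative scoring is defined in Section 7 of the rules.

    # (solver, error score, correctly solved score, CPU time score)
    sequential = [
        ("CVC4",        0.000, 7273.900,  45.392),
        ("MathSAT",     0.000, 6750.620, 110.244),
        ("SMTInterpol", 0.000, 6785.016, 131.860),
        ("Yices 2.6.0", 0.000, 7410.299,   4.906),
        ("opensmt2",    2.436, 6512.865,  46.959),
        ("veriT",       0.000, 7397.598,   6.446),
        ("z3-4.7.1",    0.000, 6866.393, 103.317),
    ]

    # Sort key: error ascending, correct descending, CPU time ascending.
    ranked = sorted(sequential, key=lambda s: (s[1], -s[2], s[3]))

    for place, (name, err, correct, cpu) in enumerate(ranked, start=1):
        print(f"{place}. {name:12s} errors={err:.3f} "
              f"correct={correct:.3f} cpu={cpu:.3f}")

On these inputs the sketch ranks Yices 2.6.0 first, matching the winner announced above; note that the non-competing entries (marked [n]) are excluded from the official ranking.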