SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.


SMT-COMP 2018


QF_ALIA (Main Track)

Competition results for the QF_ALIA division as of Fri Jul 13 00:02:11 GMT.

Benchmarks in this division: 139
Time limit: 1200s

Winners

Sequential Performance: Yices 2.6.0
Parallel Performance: Yices 2.6.0

Result table [1]

Sequential Performance

Solver        Error Score   Correctly Solved Score   CPU time Score   Solved   Unsolved
CVC4          0.000         137.590                  24.299           138      1
MathSATn      0.000         0.000                    0.383            0        139
SMTInterpol   0.000         139.000                  7.604            139      0
Yices 2.6.0   0.000         139.000                  0.243            139      0
veriT         0.000         20.566                   0.042            16       123
z3-4.7.1n     0.000         127.087                  109.143          134      5

Parallel Performance

Solver        Error Score   Correctly Solved Score   CPU time Score   WALL time Score   Solved   Unsolved
CVC4          0.000         137.590                  24.299           24.310            138      1
MathSATn      0.000         0.000                    0.383            0.375             0        139
SMTInterpol   0.000         139.000                  7.604            4.975             139      0
Yices 2.6.0   0.000         139.000                  0.243            0.244             139      0
veriT         0.000         20.566                   0.042            0.042             16       123
z3-4.7.1n     0.000         127.087                  109.143          109.157           134      5

n. Non-competing.

1. Scores are computed according to Section 7 of the rules.
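The official scoring is defined in Section 7 of the rules. As an illustration only, the following sketch ranks solvers from the sequential table above, assuming the lexicographic order suggested by the column names: fewer errors first, then more correctly solved benchmarks, then less CPU time. The tuples below are just the published table values re-entered by hand.

```python
# Hedged sketch of a lexicographic ranking over the sequential results.
# Assumption (not taken from this page): ties on error score are broken by
# correctly-solved score, then by CPU time, as Section 7 of the rules suggests.

rows = [
    # (solver, error score, correctly solved score, CPU time score)
    ("CVC4",        0.000, 137.590,  24.299),
    ("MathSATn",    0.000,   0.000,   0.383),
    ("SMTInterpol", 0.000, 139.000,   7.604),
    ("Yices 2.6.0", 0.000, 139.000,   0.243),
    ("veriT",       0.000,  20.566,   0.042),
    ("z3-4.7.1n",   0.000, 127.087, 109.143),
]

# Sort ascending by errors, descending by correctly solved, ascending by CPU time.
ranked = sorted(rows, key=lambda r: (r[1], -r[2], r[3]))
print(ranked[0][0])  # → Yices 2.6.0
```

Under this ordering, SMTInterpol and Yices 2.6.0 tie on errors (0.000) and correctly solved (139.000), and Yices 2.6.0 wins on CPU time (0.243s vs 7.604s), matching the winner announced above.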