SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2018

QF_AUFLIA (Main Track)

Competition results for the QF_AUFLIA division as of Fri 13 Jul 2018, 00:02:11 GMT.

Benchmarks in this division: 1009
Time limit: 1200s

Winners

Sequential Performance: Yices 2.6.0
Parallel Performance:   Yices 2.6.0

Result table¹

Sequential Performance

Solver        Error Score   Correctly Solved Score   CPU Time Score   Solved   Unsolved
CVC4          0.000         1009.000                 5.467            1009     0
MathSATⁿ      0.000         0.000                    24.746           0        1009
SMTInterpol   0.000         1009.000                 2.246            1009     0
Yices 2.6.0   0.000         1009.000                 0.034            1009     0
veriT         0.000         66.447                   0.012            15       994
z3-4.7.1ⁿ     0.000         1009.000                 0.106            1009     0

Parallel Performance

Solver        Error Score   Correctly Solved Score   CPU Time Score   Wall Time Score   Solved   Unsolved
CVC4          0.000         1009.000                 5.467            5.466             1009     0
MathSATⁿ      0.000         0.000                    24.747           24.748            0        1009
SMTInterpol   0.000         1009.000                 2.246            1.358             1009     0
Yices 2.6.0   0.000         1009.000                 0.034            0.036             1009     0
veriT         0.000         66.447                   0.012            0.014             15       994
z3-4.7.1ⁿ     0.000         1009.000                 0.106            0.106             1009     0

ⁿ Non-competing.

1. Scores are computed according to Section 7 of the rules.
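
For orientation only, the sketch below shows one way per-benchmark results could be aggregated into the three sequential columns above. It is a minimal, unweighted simplification, not the official Section 7 computation: the actual rules apply benchmark weights, which is presumably why veriT's 15 solved instances yield a correctly-solved score of 66.447 rather than 15.000. All names in the sketch (BenchmarkRun, division_scores) are hypothetical.

    # Minimal sketch of aggregating per-benchmark results into division
    # scores. NOT the official Section 7 computation: this version weights
    # every benchmark equally, whereas the real rules apply benchmark
    # weights. All names are hypothetical.

    from dataclasses import dataclass
    from typing import Iterable, Tuple

    @dataclass
    class BenchmarkRun:
        answer: str         # solver output: "sat", "unsat", or "unknown"
        expected: str       # known benchmark status, or "unknown"
        cpu_seconds: float  # CPU time consumed, capped at the 1200 s limit

    def division_scores(runs: Iterable[BenchmarkRun]) -> Tuple[float, float, float]:
        """Return (error score, correctly solved score, mean CPU time)."""
        runs = list(runs)
        if not runs:
            return 0.0, 0.0, 0.0
        errors, solved, cpu_total = 0, 0, 0.0
        for r in runs:
            if r.answer in ("sat", "unsat") and r.expected not in ("unknown", r.answer):
                errors += 1   # definite answer contradicting the known status
            elif r.answer in ("sat", "unsat"):
                solved += 1   # definite answer consistent with the known status
            cpu_total += r.cpu_seconds
        return float(errors), float(solved), cpu_total / len(runs)

In this simplified form, a solver that answers all 1009 benchmarks correctly, like Yices 2.6.0, would score (0.000, 1009.000, mean CPU time), matching the shape of the table above. The parallel table additionally reports a wall-clock column, which can drop below the CPU time for solvers that exploit multiple cores (compare SMTInterpol's 2.246 CPU time score with its 1.358 wall time score).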