SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2018

QF_UFLIA (Application Track)

Competition results for the QF_UFLIA division, as of Fri Jul 13 00:00:31 GMT 2018.

Benchmarks in this division: 780
Time limit: 2400s

Winner: SMTInterpol

Result table¹

Solver           Error Score   Correctly Solved Score   avg. CPU time   avg. WALL time
CVC4                       0                   769303       445920.41        445925.39
mathsat-5.5.2 n            0                   770526       367563.31        367575.57
SMTInterpol                0                   771844       469999.17        436150.90
Yices2                     0                   769823       375437.89        375450.33
z3-4.7.1 n                 0                   778554       210086.31        210077.63

n. Non-competing.

1. Scores are computed according to Section 7 of the rules.
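The winner reported above can be read off the result table. As a minimal sketch (a hypothetical simplification of the Section 7 scoring, not the official procedure): among competing solvers with an error score of 0, pick the one with the highest correctly-solved score, excluding non-competing entries (marked n).

```python
# Sketch: determine the division winner from the result table above.
# Assumption (simplified stand-in for Section 7 of the rules): the winner
# is the competing solver with error score 0 and the highest
# correctly-solved score; non-competing solvers are excluded.

results = [
    # (solver, non_competing, error_score, correctly_solved_score)
    ("CVC4",          False, 0, 769303),
    ("mathsat-5.5.2", True,  0, 770526),
    ("SMTInterpol",   False, 0, 771844),
    ("Yices2",        False, 0, 769823),
    ("z3-4.7.1",      True,  0, 778554),
]

competing = [r for r in results if not r[1] and r[2] == 0]
winner = max(competing, key=lambda r: r[3])[0]
print(winner)  # SMTInterpol
```

Note that z3-4.7.1 has the highest correctly-solved score overall, but as a non-competing entrant it is not eligible, so SMTInterpol takes the division.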