SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2018


QF_UFLIA (Main Track)

Competition results for the QF_UFLIA division as of Fri Jul 13 00:02:11 GMT 2018.

Benchmarks in this division: 583
Time limit: 1200s

Winners

Sequential Performance: Yices 2.6.0
Parallel Performance:   Yices 2.6.0

Result table (1)

Sequential Performance

Solver         Error Score   Correctly Solved Score   CPU Time Score   Solved   Unsolved
CVC4               0.000           583.000                  1.370         583        0
MathSAT (n)        0.000           578.914                 15.939         579        4
SMTInterpol        0.000           583.000                  3.450         583        0
Yices 2.6.0        0.000           583.000                  0.121         583        0
veriT              0.000           535.715                101.256         524       59
z3-4.7.1 (n)       0.000           583.000                  0.176         583        0

Parallel Performance

Solver         Error Score   Correctly Solved Score   CPU Time Score   Wall Time Score   Solved   Unsolved
CVC4               0.000           583.000                  1.370            1.359         583        0
MathSAT (n)        0.000           578.914                 15.939           15.939         579        4
SMTInterpol        0.000           583.000                  3.450            2.020         583        0
Yices 2.6.0        0.000           583.000                  0.121            0.122         583        0
veriT              0.000           535.715                101.257          101.179         524       59
z3-4.7.1 (n)       0.000           583.000                  0.176            0.174         583        0

(n) Non-competing.

(1) Scores are computed according to Section 7 of the rules.
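To give a feel for how table columns like these can be derived, the sketch below aggregates per-benchmark solver results into an error count, a solved count, and a CPU-time total. It is a deliberately simplified illustration, not the official Section 7 scoring: the real rules also weight benchmarks (e.g. by family) and define the fractional "correctly solved score" seen in the table, which this sketch does not reproduce. The function name and data layout are assumptions for illustration only.

```python
# Simplified, illustrative aggregation of per-benchmark results.
# NOT the official SMT-COMP Section 7 scoring, which additionally
# weights benchmarks and produces fractional solved scores.

def division_scores(results, expected_statuses, time_limit=1200.0):
    """results: list of (answer, cpu_seconds) per benchmark, where
    answer is 'sat', 'unsat', or 'unknown';
    expected_statuses: the known status of each benchmark.
    Returns (errors, solved, total_cpu_time)."""
    errors = 0
    solved = 0
    cpu = 0.0
    for (answer, secs), expected in zip(results, expected_statuses):
        if secs > time_limit or answer == 'unknown':
            continue  # timeout or 'unknown': contributes nothing
        if expected != 'unknown' and answer != expected:
            errors += 1  # wrong answer on a benchmark with known status
        else:
            solved += 1  # correct (or unconfirmable) answer within the limit
            cpu += secs
    return errors, solved, cpu
```

For example, a solver that answers two benchmarks correctly, times out on one, gives up on one, and answers one incorrectly would score one error and two solved instances.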