SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.


SMT-COMP 2018


QF_UFLRA (Main Track)

Competition results for the QF_UFLRA division as of Fri Jul 13 00:02:11 GMT.

Benchmarks in this division: 1284
Time limit: 1200s

Winners

Sequential Performance   Parallel Performance
Yices 2.6.0              Yices 2.6.0

Result table [1]

Sequential Performance

Solver         Error Score   Correctly Solved Score   CPU time Score   Solved   Unsolved
CVC4           0.000         1193.635                 85.529           1276     8
MathSAT [n]    0.000         1193.635                 84.644           1276     8
SMTInterpol    0.000         1193.635                 88.245           1276     8
Yices 2.6.0    0.000         1216.226                 64.731           1278     6
veriT          0.000         1193.635                 84.660           1276     8
z3-4.7.1 [n]   0.000         1188.904                 73.617           1270     14

Parallel Performance

Solver         Error Score   Correctly Solved Score   CPU time Score   WALL time Score   Solved   Unsolved
CVC4           0.000         1193.635                 85.530           85.539            1276     8
MathSAT [n]    0.000         1193.635                 84.644           84.656            1276     8
SMTInterpol    0.000         1193.635                 91.073           86.457            1276     8
Yices 2.6.0    0.000         1216.226                 64.731           64.740            1278     6
veriT          0.000         1193.635                 84.660           84.669            1276     8
z3-4.7.1 [n]   0.000         1188.904                 73.618           75.481            1270     14

[n] Non-competing.

[1] Scores are computed according to Section 7 of the rules.
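The score columns above suggest how a ranking can be derived: solvers with fewer errors come first, ties are broken by the correctly-solved score, and further ties by CPU time. The exact computation is defined in Section 7 of the rules; the sketch below only illustrates that lexicographic comparison on a sample of the sequential rows, with field names chosen for this example.

```python
# Hedged sketch of a lexicographic ranking over score tuples.
# Field names and the ordering are illustrative assumptions, not the
# official SMT-COMP scoring code (see Section 7 of the rules).
from typing import NamedTuple, List


class Entry(NamedTuple):
    solver: str
    error_score: float    # penalty for wrong answers (0.000 in every row above)
    correct_score: float  # correctly-solved score (higher is better)
    cpu_score: float      # CPU-time score (lower is better)


def rank(entries: List[Entry]) -> List[Entry]:
    # Fewer errors first, then more correct answers, then less CPU time.
    return sorted(entries, key=lambda e: (e.error_score, -e.correct_score, e.cpu_score))


# Three rows from the sequential table above.
table = [
    Entry("CVC4", 0.000, 1193.635, 85.529),
    Entry("Yices 2.6.0", 0.000, 1216.226, 64.731),
    Entry("z3-4.7.1", 0.000, 1188.904, 73.617),
]

print(rank(table)[0].solver)  # Yices 2.6.0 leads this sample, matching the winner above
```

On this sample, all error scores tie at 0.000, so the comparison falls through to the correctly-solved score, where Yices 2.6.0's 1216.226 is highest.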