SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2018

QF_LIA (Main Track)

Competition results for the QF_LIA division as of Fri Jul 13 00:02:11 GMT

Benchmarks in this division: 6947
Time limit: 1200s

Winners

Sequential Performance: SPASS-SATT
Parallel Performance:   SPASS-SATT

Result table (1)

Sequential Performance

Solver         Error Score  Correctly Solved Score  CPU Time Score  Solved  Unsolved
CVC4                 0.000                5891.019         194.986    6357       590
Ctrl-Ergo            0.000                6221.467         156.086    6259       688
MathSAT (n)          0.000                6135.114         164.626    6528       419
SMTInterpol          0.000                5915.623         204.123    6286       661
SMTRAT-Rat           0.000                4049.914         515.394    3112      3835
SPASS-SATT           0.000                6587.626          72.048    6744       203
Yices 2.6.0          0.000                5867.976         209.452    6232       715
veriT                0.000                3155.162         295.434    2734      4213
z3-4.7.1 (n)         0.000                5733.374         224.539    6195       752

Parallel Performance

Solver         Error Score  Correctly Solved Score  CPU Time Score  Wall Time Score  Solved  Unsolved
CVC4                 0.000                5891.019         194.988          194.982    6357       590
Ctrl-Ergo            0.000                6307.810         355.784           89.538    6357       590
MathSAT (n)          0.000                6135.114         164.627          164.624    6528       419
SMTInterpol          0.000                5920.370         215.278          199.167    6291       656
SMTRAT-Rat           0.000                4049.914         515.401          515.430    3112      3835
SPASS-SATT           0.000                6587.626          72.048           72.037    6744       203
Yices 2.6.0          0.000                5867.976         209.455          209.454    6232       715
veriT                0.000                3155.162         295.437          295.448    2734      4213
z3-4.7.1 (n)         0.000                5733.374         224.541          224.515    6195       752

n. Non-competing.

1. Scores are computed according to Section 7 of the rules.
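
As a quick illustration of how the result tables above can be read programmatically, the following sketch (a hypothetical helper, not part of the official SMT-COMP tooling) re-enters the sequential results and checks that Solved plus Unsolved matches the 6947 benchmarks in the division. The actual score computation is defined in Section 7 of the rules and is not reproduced here.

    # Minimal sanity-check sketch over the sequential result table above.
    # The numbers are copied verbatim from the table; this does NOT
    # implement the Section 7 scoring rules.
    TOTAL_BENCHMARKS = 6947

    # (solver, error score, correctly solved score, CPU time score, solved, unsolved)
    SEQUENTIAL_RESULTS = [
        ("CVC4",         0.000, 5891.019, 194.986, 6357,  590),
        ("Ctrl-Ergo",    0.000, 6221.467, 156.086, 6259,  688),
        ("MathSAT",      0.000, 6135.114, 164.626, 6528,  419),
        ("SMTInterpol",  0.000, 5915.623, 204.123, 6286,  661),
        ("SMTRAT-Rat",   0.000, 4049.914, 515.394, 3112, 3835),
        ("SPASS-SATT",   0.000, 6587.626,  72.048, 6744,  203),
        ("Yices 2.6.0",  0.000, 5867.976, 209.452, 6232,  715),
        ("veriT",        0.000, 3155.162, 295.434, 2734, 4213),
        ("z3-4.7.1",     0.000, 5733.374, 224.539, 6195,  752),
    ]

    for name, _err, _solved_score, _cpu_score, solved, unsolved in SEQUENTIAL_RESULTS:
        # Every benchmark is either solved or unsolved within the 1200 s time limit.
        assert solved + unsolved == TOTAL_BENCHMARKS, name
        print(f"{name:<12} solved {solved:4d} of {TOTAL_BENCHMARKS}")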