SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2018

LIA (Main Track)

Competition results for the LIA division as of Fri Jul 13, 2018, 00:02:11 GMT.

Benchmarks in this division: 388
Time limit: 1200s

Winners

Sequential Performance: CVC4
Parallel Performance: CVC4

Result table¹

Sequential Performance

Solver         Error Score   Correctly Solved Score   CPU Time Score   Solved   Unsolved
CVC4           0.000         388.000                    0.060          388      0
Vampire 4.3    0.000         243.278                  442.142          221      167
veriT          0.000         146.351                    9.187          157      231
z3-4.7.1n      0.000         388.000                    0.046          388      0

Parallel Performance

Solver         Error Score   Correctly Solved Score   CPU Time Score   Wall Time Score   Solved   Unsolved
CVC4           0.000         388.000                     0.060            0.060          388      0
Vampire 4.3    0.000         247.773                  1721.871          433.134          227      161
veriT          0.000         146.351                     9.187            9.168          157      231
z3-4.7.1n      0.000         388.000                     0.046            0.046          388      0

n. Non-competing.

1. Scores are computed according to Section 7 of the rules.
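
To make the ranking above concrete, the sketch below (Python; not the official SMT-COMP scoring code) orders the sequential results under the assumption that solvers are compared lexicographically: lowest error score first, then highest correctly solved score, then lowest CPU time score, with non-competing entries (marked n) excluded when determining the winner. The authoritative definitions are in Section 7 of the rules.

from typing import NamedTuple


class SeqResult(NamedTuple):
    solver: str
    error_score: float
    correctly_solved: float
    cpu_time: float
    competing: bool = True


# Values copied from the Sequential Performance table above.
results = [
    SeqResult("CVC4",        0.000, 388.000,   0.060),
    SeqResult("Vampire 4.3", 0.000, 243.278, 442.142),
    SeqResult("veriT",       0.000, 146.351,   9.187),
    SeqResult("z3-4.7.1",    0.000, 388.000,   0.046, competing=False),
]


def rank_key(r: SeqResult):
    # Lower error score and CPU time are better; a higher correctly solved
    # score is better, hence the negation. (Assumed lexicographic order.)
    return (r.error_score, -r.correctly_solved, r.cpu_time)


ranked = sorted(results, key=rank_key)
winner = min((r for r in results if r.competing), key=rank_key)

for place, r in enumerate(ranked, start=1):
    tag = "" if r.competing else " (non-competing)"
    print(f"{place}. {r.solver}{tag}: errors={r.error_score}, "
          f"solved={r.correctly_solved}, cpu={r.cpu_time}s")

print(f"Winner (sequential): {winner.solver}")

Under these assumptions the script ranks z3-4.7.1 first on raw scores, but since it is non-competing, CVC4 is reported as the division winner, matching the Winners section above.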