SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2016

LRA (Main Track)

Competition results for the LRA division as of Thu Jul 7 07:24:34 GMT 2016.

Benchmarks in this division: 339
Time Limit: 1200s

Winners

Sequential Performance   Parallel Performance
CVC4                     CVC4

Result table [1]

Sequential Performance

Solver                     Error Score   Correct Score   avg. CPU time (s)
CVC4                       0.000         337.522         15.871
vampire_smt_4.1            0.000         325.231         38.268
vampire_smt_4.1_parallel   0.000         323.753         108.677
veriT-dev                  0.000         198.834         666.723
z3 [n]                     0.000         331.947         52.312

Parallel Performance

Solver                     Error Score   Correct Score   avg. CPU time (s)   avg. wall time (s)   Unsolved
CVC4                       0.000         337.522         15.876              15.858               1
vampire_smt_4.1            0.000         325.231         38.268              38.095               7
vampire_smt_4.1_parallel   0.000         323.753         178.671             46.309               8
veriT-dev                  0.000         198.834         667.022             666.697              61
z3 [n]                     0.000         331.947         52.341              52.310               4

[n] Non-competitive.

[1] Scores are computed according to Section 7 of the rules.