SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2016

QF_LRA (Main Track)

Competition results for the QF_LRA division as of Thu 7 Jul 2016, 07:24:34 GMT.

Benchmarks in this division: 1626
Time Limit: 1200s

Winners

Sequential Performance: CVC4
Parallel Performance:   CVC4

Result table [1]

Sequential Performance

Solver         Error Score   Correct Score   avg. CPU time (s)
CVC4                 0.000        1601.997              61.989
MathSat5 [n]         0.000        1574.475             109.915
OpenSMT2             0.000        1510.710             214.228
SMT-RAT              0.000        1415.279             344.351
SMTInterpol          0.000        1571.240             118.790
Yices2               0.000        1593.356              65.700
toysmt               0.000        1172.885             582.313
veriT-dev            0.000        1577.045              95.717
z3 [n]               0.000        1547.832             157.261

Parallel Performance

Solver         Error Score   Correct Score   avg. CPU time (s)   avg. wall time (s)   Unsolved
CVC4                 0.000        1601.997              62.004               62.088         20
MathSat5 [n]         0.000        1574.475             109.957              109.870         64
OpenSMT2             0.000        1510.710             214.323              214.198        263
SMT-RAT              0.000        1415.279             344.538              344.329        433
SMTInterpol          0.000        1572.061             127.002              112.669         54
Yices2               0.000        1593.356              65.730               65.654         34
toysmt               0.000        1172.885             582.500              582.280        811
veriT-dev            0.000        1577.045              95.759               95.701         55
z3 [n]               0.000        1547.832             157.327              157.232        145

[n] Non-competitive.

[1] Scores are computed according to Section 7 of the competition rules.
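
The summary columns above are aggregates over the 1626 benchmark runs in the division. As an informal illustration only (not the official scoring, which is defined in Section 7 of the rules and uses benchmark weighting, hence the fractional correct scores), a per-solver summary row could be aggregated roughly as in the following Python sketch; the Run record layout and the TIME_LIMIT constant are assumptions made for the example.

from dataclasses import dataclass

TIME_LIMIT = 1200.0  # per-benchmark limit in seconds, from the division header


@dataclass
class Run:
    """One solver run on one benchmark (hypothetical record layout)."""
    expected: str    # known status: "sat", "unsat", or "unknown"
    reported: str    # solver answer: "sat", "unsat", or "unknown"/"timeout"
    cpu_time: float  # seconds
    wall_time: float # seconds


def summarize(runs):
    """Aggregate per-benchmark runs into the kind of row shown above.

    NOTE: this is an unweighted approximation; the official correct score
    is weighted per Section 7 of the SMT-COMP 2016 rules, which is why the
    published correct scores are not integers.
    """
    errors = sum(1 for r in runs
                 if r.reported in ("sat", "unsat")
                 and r.expected in ("sat", "unsat")
                 and r.reported != r.expected)
    correct = sum(1 for r in runs
                  if r.reported in ("sat", "unsat")
                  and r.reported == r.expected)
    unsolved = len(runs) - correct - errors
    # Times are capped at the limit so that timeouts count as 1200 s.
    avg_cpu = sum(min(r.cpu_time, TIME_LIMIT) for r in runs) / len(runs)
    avg_wall = sum(min(r.wall_time, TIME_LIMIT) for r in runs) / len(runs)
    return {
        "error_score": errors,
        "correct_score": correct,
        "avg_cpu_time": avg_cpu,
        "avg_wall_time": avg_wall,
        "unsolved": unsolved,
    }

Roughly speaking, the sequential ranking is based on CPU time while the parallel ranking is based on wall-clock time, which is why the two tables report slightly different averages for the same solvers.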