SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2016

QF_NIA (Main Track)

Competition results for the QF_NIA division as of Thu Jul 7 07:24:34 GMT 2016.

Benchmarks in this division: 8593
Time Limit: 1200s

Winners

Sequential Performance   Parallel Performance
Yices2                   Yices2

Result table (1)

Sequential Performance

Solver      Error Score   Correct Score   avg. CPU time (s)
AProVE            0.000        4678.012             811.752
CVC4              0.000        6790.593             231.114
ProB              0.000        3197.237             991.049
SMT-RAT           0.000        6229.991             507.871
Yices2            0.000        7408.849             331.911
raSAT 0.3         0.000        3305.014            1164.377
raSAT 0.4         0.000        4213.861             771.217
z3 (n)            0.000        8010.537             170.584

Parallel Performance

Solver      Error Score   Correct Score   avg. CPU time (s)   avg. WALL time (s)   Unsolved
AProVE            0.000        4678.012             831.329              824.487        320
CVC4              0.000        6790.593             231.134              231.484        362
ProB              0.000        3197.237             991.652              991.061       1036
SMT-RAT           0.000        6229.991             508.136              507.882        150
Yices2            0.000        7408.849             332.101              331.918        142
raSAT 0.3       598.828        3305.014            1165.022             1165.667        919
raSAT 0.4         0.000        4308.662            1237.141              889.778        576
z3 (n)            0.000        8010.537             170.671              170.508         27

(n) Non-competitive.

(1) Scores are computed according to Section 7 of the rules.
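
The ordering of the tables above can be reproduced with a short script. The sketch below is a minimal illustration, assuming the lexicographic ranking suggested by the columns (lower error score first, then higher correct score, then lower average CPU time); the per-benchmark weighting defined in Section 7 of the rules is not reimplemented here, and the score values are simply taken from the sequential table above.

    # Minimal sketch: rank the sequential QF_NIA results, assuming a
    # lexicographic ordering (error score ascending, correct score
    # descending, avg. CPU time ascending). Scores are taken as given;
    # the Section 7 benchmark weighting is NOT recomputed here.

    # (solver, error score, correct score, avg. CPU time in seconds)
    sequential = [
        ("AProVE",    0.000, 4678.012,  811.752),
        ("CVC4",      0.000, 6790.593,  231.114),
        ("ProB",      0.000, 3197.237,  991.049),
        ("SMT-RAT",   0.000, 6229.991,  507.871),
        ("Yices2",    0.000, 7408.849,  331.911),
        ("raSAT 0.3", 0.000, 3305.014, 1164.377),
        ("raSAT 0.4", 0.000, 4213.861,  771.217),
        ("z3 (n)",    0.000, 8010.537,  170.584),  # non-competitive entry
    ]

    # Sort key: error ascending, correct descending, CPU time ascending.
    ranked = sorted(sequential, key=lambda r: (r[1], -r[2], r[3]))

    for rank, (solver, err, correct, cpu) in enumerate(ranked, start=1):
        print(f"{rank}. {solver:<10} error={err:.3f} "
              f"correct={correct:.3f} cpu={cpu:.3f}s")

Under this ordering the non-competitive z3 entry comes first on correct score; the best-ranked competitive solver is Yices2, matching the winner listed above.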