SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2017

QF_NIA (Main Track)

Competition results for the QF_NIA division as of Fri Jul 21, 2017, 10:18:02 GMT.

Benchmarks in this division: 23876
Time Limit: 1200s
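
QF_NIA is the SMT-LIB logic of quantifier-free nonlinear integer arithmetic: formulas over integer variables in which variables may be multiplied with each other, not only with constants. As a rough illustration only (not one of the actual competition benchmarks), a benchmark in this division is an SMT-LIB 2 script along these lines:

    (set-logic QF_NIA)
    (set-info :status sat)
    (declare-fun x () Int)
    (declare-fun y () Int)
    ; nonlinear constraint: the two unknowns are multiplied together
    (assert (> (* x y) 10))
    (assert (< (+ x y) 10))
    (assert (and (> x 0) (> y 0)))
    (check-sat)
    (exit)

Roughly speaking, a solver contributes to the correct score (or, for a wrong answer, the error score) when it reports sat or unsat within the time limit, and benchmarks it cannot answer count as unsolved; the exact weighting is defined in Section 7 of the rules (see the footnote below the tables).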

Winners

Sequential Performance: CVC4
Parallel Performance:   CVC4

Result table¹

Sequential Performance

Solver      Error Score  Correct Score  CPU Time  Unsolved
AProVE            0.000       7762.332   480.255     15026
CVC4              0.000      17512.574   336.305     10177
SMTRAT            0.000      10189.752   667.870     17263
Yices2            0.000      17092.777   356.094      7899
z3-4.5.0n         0.000      13892.594   496.172     14082

Parallel Performance

Solver      Error Score  Correct Score  CPU Score  WALL Score  Unsolved
AProVE            0.000       7764.361    489.059     640.149     15023
CVC4              0.000      17512.574    336.359     337.423     10177
SMTRAT            0.000      10189.752    667.875     667.949     17263
Yices2            0.000      17092.777    356.101     356.157      7899
z3-4.5.0n         0.000      13892.594    496.175     496.270     14082

n. Non-competing.

1. Scores are computed according to Section 7 of the rules.