SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2018

QF_NIA (Main Track)

Competition results for the QF_NIA division as of Fri Jul 13 00:02:11 GMT

Benchmarks in this division: 23876
Time limit: 1200s

Winners

Sequential Performance    Parallel Performance
CVC4                      CVC4

Result table1

Sequential Performance

Solver         Error Score   Correctly Solved Score   CPU time Score   Solved   Unsolved
AProVE         0.000         7768.397                 491.014          8849     15027
CVC4           0.000         18679.013                289.597          15662    8214
SMTRAT-Rat     7.809         9014.531                 731.527          6299     17577
Yices 2.6.0    0.000         17253.973                344.000          16094    7782
z3-4.7.1n      0.000         14657.390                481.294          10387    13489

Parallel Performance

Solver         Error Score   Correctly Solved Score   CPU time Score   WALL time Score   Solved   Unsolved
AProVE         0.000         7768.612                 499.276          639.335           8850     15026
CVC4           0.000         18679.013                289.746          292.833           15662    8214
SMTRAT-Rat     7.809         9014.531                 731.532          731.602           6299     17577
Yices 2.6.0    0.000         17253.973                344.003          344.014           16094    7782
z3-4.7.1n      0.000         14657.390                481.298          481.324           10387    13489

n. Non-competing.

1. Scores are computed according to Section 7 of the rules.
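
The sketch below is a minimal, unofficial illustration of how the result table relates to the declared winner, assuming the Section 7 rules compare solvers lexicographically on (error score ascending, correctly solved score descending, CPU time score ascending). That comparison order is an assumption for illustration only, not the official scoring implementation; all numbers are copied verbatim from the sequential-performance table above.

    # Unofficial sketch: rank the sequential-performance rows, assuming a
    # lexicographic comparison on (error score, correctly solved score,
    # CPU time score). Numbers are copied from the table above.
    from typing import NamedTuple


    class Result(NamedTuple):
        solver: str
        error_score: float        # penalty for wrong answers (lower is better)
        correctly_solved: float   # score for correct answers (higher is better)
        cpu_time_score: float     # time-based tie-breaker (lower is better)
        solved: int
        unsolved: int


    sequential = [
        Result("AProVE",      0.000,  7768.397, 491.014,  8849, 15027),
        Result("CVC4",        0.000, 18679.013, 289.597, 15662,  8214),
        Result("SMTRAT-Rat",  7.809,  9014.531, 731.527,  6299, 17577),
        Result("Yices 2.6.0", 0.000, 17253.973, 344.000, 16094,  7782),
        Result("z3-4.7.1",    0.000, 14657.390, 481.294, 10387, 13489),  # non-competing
    ]

    # Sanity check: each solver was run on all 23876 benchmarks in the division.
    for r in sequential:
        assert r.solved + r.unsolved == 23876

    # Assumed ordering: fewer errors, then more correctly solved, then less CPU time.
    ranking = sorted(
        sequential,
        key=lambda r: (r.error_score, -r.correctly_solved, r.cpu_time_score),
    )

    for rank, r in enumerate(ranking, start=1):
        print(f"{rank}. {r.solver}")

With these inputs CVC4 comes out first, matching the declared winner above. Note that the correctly solved score is not simply the raw number of solved instances (Yices 2.6.0 solves more benchmarks but has a lower score), which is why the sketch keys on the score column rather than on the Solved column.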