SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2016


QF_NIA (Unknown Benchmarks Track)

Competition results for the QF_NIA division as of Thu Jul 7 07:28:02 GMT 2016.

Benchmarks in this division: 927
Time Limit: 1200s

Non-Competitive division

Result table

Sequential Performance

Solver      Solved  avg. CPU time (s)  avg. WALL time (s)
AProVE           3          449315.19           442892.63
CVC4            31          518164.36           517934.15
ProB            13          275286.20           275028.92
SMT-RAT        202          438463.92           438224.91
Yices2         694          143649.17           143549.23
raSAT 0.3        6          551869.02           551551.42
raSAT 0.4       14         1094024.83           545627.15
z3 (n)         244          204345.09           204075.89

(n) Non-competitive entry.