SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.


SMT-COMP 2023


QF_NIA (Incremental Track)

Competition results for the QF_NIA logic in the Incremental Track.

Page generated on 2023-07-06 16:05:24 +0000

Benchmarks: 12
Time Limit: 1200 seconds
Memory Limit: 60 GB

Winners

Parallel Performance
SMTInterpol

Parallel Performance

Solver          Error Score  Correct Score  CPU Time Score  Wall Time Score  Unsolved  Timeout  Memout
2021-MathSAT5n  0            4181666        1726.19         1476.44          37547     2        0
SMTInterpol     0            4181657        3730.52         810.41           37556     1        0
cvc5            0            1739311        11578.32        11447.52         2479902   11       0
Yices2 Fixedn   0            192904         8992.26         8983.38          4026309   12       0
Yices2          0            192466         8996.08         8986.93          4026747   12       0

n Non-competing.
N/A: Benchmarks not known to be SAT/UNSAT, respectively.
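Assuming the incremental track's usual scoring, where Correct Score counts solved check-sat calls and Unsolved counts the remaining ones, the two columns should sum to the same total for every solver (the total number of check-sat calls across the 12 benchmarks). A quick cross-check of the table above:

```python
# Correct Score and Unsolved values from the results table above.
results = {
    "2021-MathSAT5": (4181666, 37547),
    "SMTInterpol":   (4181657, 37556),
    "cvc5":          (1739311, 2479902),
    "Yices2 Fixed":  (192904, 4026309),
    "Yices2":        (192466, 4026747),
}

# correct + unsolved should be the same fixed query count for each solver.
totals = {name: correct + unsolved for name, (correct, unsolved) in results.items()}
assert len(set(totals.values())) == 1

print(next(iter(totals.values())))  # → 4219213 check-sat calls in total
```

The check passes: every row sums to 4,219,213 check-sat calls, which is consistent with the columns partitioning a fixed query set.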