SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2021

UFNIA (Incremental Track)

Competition results for the UFNIA logic in the Incremental Track.

Page generated on 2021-07-18 17:30:28 +0000

Benchmarks: 2031
Time Limit: 1200 seconds
Memory Limit: 60 GB

Winners

Parallel Performance: cvc5-inc

Parallel Performance

Solver                     | Error Score | Correct Score | CPU Time Score (s) | Wall Time Score (s) | Unsolved | Timeout | Memout
---------------------------|-------------|---------------|--------------------|---------------------|----------|---------|-------
2020-z3 (n)                | 0           | 88723         | 1371089.124        | 1371476.243         | 257988   | 1011    | 1
z3 (n)                     | 0           | 88660         | 1369251.934        | 1369623.311         | 258051   | 1005    | 2
cvc5-inc                   | 0           | 33181         | 1006156.016        | 1006033.357         | 313530   | 821     | 0
2020-CVC4-inc (n)          | 0           | 29859         | 971655.327         | 971548.538          | 316852   | 797     | 0
SMTInterpol                | 0           | 15485         | 1362244.697        | 1346276.684         | 331226   | 1088    | 0
UltimateEliminator+MathSAT | 0           | 0             | 8901.884           | 3929.855            | 346711   | 0       | 0

(n) Non-competing.