SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2021

UFLIA (Unsat Core Track)

Competition results for the UFLIA logic in the Unsat Core Track.

Page generated on 2021-07-18 17:31:25 +0000

Benchmarks: 3862
Time Limit: 1200 seconds
Memory Limit: 60 GB

Winners

Sequential Performance: cvc5-uc
Parallel Performance: cvc5-uc

Sequential Performance

Solver                       Error Score  Correct Score  CPU Time Score  Wall Time Score  Timeout  Memout
cvc5-uc                      0            923558         174712.363      174688.711       31       0
2020-CVC4-ucⁿ                0            922036         202177.198      202177.266       154      0
2020-z3ⁿ                     0            870462         270568.604      270782.548       157      6
z3ⁿ                          0            868471         276338.608      276321.174       157      2
Vampire                      0            828021         309420.654      239685.964       179      0
SMTInterpol                  0            597007         1641734.328     1631418.451      1317     0
SMTInterpol-remus            0            496919         2759845.82      2681492.82       1706     0
UltimateEliminator+MathSAT   2            2443           38405.967       29421.979        11       1

Parallel Performance

Solver                       Error Score  Correct Score  CPU Time Score  Wall Time Score  Timeout  Memout
cvc5-uc                      0            923558         174736.093      174683.181       31       0
2020-CVC4-ucⁿ                0            922036         202213.818      202170.276       154      0
2020-z3ⁿ                     0            870462         270875.024      270775.718       157      6
z3ⁿ                          0            868471         276369.468      276314.344       157      2
Vampire                      0            832035         364620.884      207132.509       128      0
SMTInterpol                  0            598363         1662024.268     1626806.642      1296     0
SMTInterpol-remus            0            592103         2834085.17      2647684.608      1303     0
UltimateEliminator+MathSAT   2            2443           38405.967       29421.979        11       1

ⁿ Non-competing.