SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.


SMT-COMP 2021


QF_AUFLIA (Unsat Core Track)

Competition results for the QF_AUFLIA logic in the Unsat Core Track.

Page generated on 2021-07-18 17:31:25 +0000

Benchmarks: 300
Time Limit: 1200 seconds
Memory Limit: 60 GB

Winners

Sequential Performance: cvc5-uc
Parallel Performance: cvc5-uc

Sequential Performance

Solver               Error Score  Correct Score  CPU Time Score  Wall Time Score  Timeout  Memout
2020-CVC4-uc n       0            24996            376.118          376.732       0        0
z3 n                 0            24250             19.301           21.48        0        0
2020-z3 n            0            24250             24.395           24.445       0        0
cvc5-uc              0            20101           1383.468         1381.463       0        0
Yices2               0            18845             17.543           18.916       0        0
2020-Yices2-fixed n  0            18845             17.581           20.007       0        0
SMTInterpol-remus    0            12303          36202.7          33755.457       13       0
MathSAT5 n           0             2329             98.577          101.983       0        0
SMTInterpol          0             1380            589.523          256.713       0        0

Parallel Performance

Solver               Error Score  Correct Score  CPU Time Score  Wall Time Score  Timeout  Memout
2020-CVC4-uc n       0            24996            376.118          376.732       0        0
z3 n                 0            24250             19.301           21.48        0        0
2020-z3 n            0            24250             24.395           24.445       0        0
cvc5-uc              0            20101           1383.468         1381.463       0        0
Yices2               0            18845             17.543           18.916       0        0
2020-Yices2-fixed n  0            18845             17.581           20.007       0        0
SMTInterpol-remus    0            16211          36567.48         32978.877       0        0
MathSAT5 n           0             2329             98.577          101.983       0        0
SMTInterpol          0             1380            589.523          256.713       0        0
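The winners listed above can be derived from the tables by excluding non-competing entries and ranking the rest. The sketch below illustrates this for the sequential results; the exact tie-breaking order (fewest errors, then highest correct score, then lowest wall-time score) is an assumption about the competition's scoring, not a quote from the official rules.

```python
# Sketch: rank the sequential QF_AUFLIA unsat-core results.
# Data transcribed from the Sequential Performance table above.
# Tuple layout: (solver, non_competing, errors, correct, cpu_time, wall_time)
results = [
    ("2020-CVC4-uc",      True,  0, 24996,   376.118,   376.732),
    ("z3",                True,  0, 24250,    19.301,    21.48),
    ("2020-z3",           True,  0, 24250,    24.395,    24.445),
    ("cvc5-uc",           False, 0, 20101,  1383.468,  1381.463),
    ("Yices2",            False, 0, 18845,    17.543,    18.916),
    ("2020-Yices2-fixed", True,  0, 18845,    17.581,    20.007),
    ("SMTInterpol-remus", False, 0, 12303, 36202.7,   33755.457),
    ("MathSAT5",          True,  0,  2329,    98.577,   101.983),
    ("SMTInterpol",       False, 0,  1380,   589.523,   256.713),
]

def ranking(rows):
    """Order competing solvers: fewest errors, most correct answers,
    lowest wall-time score (assumed tie-break order)."""
    competing = [r for r in rows if not r[1]]  # drop non-competing entries
    return sorted(competing, key=lambda r: (r[2], -r[3], r[5]))

if __name__ == "__main__":
    order = ranking(results)
    print(order[0][0])  # winner of the sequential track
```

Note that 2020-CVC4-uc and z3 post higher correct scores than cvc5-uc, but both are non-competing, which is why cvc5-uc takes the win.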

n: Non-competing.