SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2021

QF_UF (Unsat Core Track)

Competition results for the QF_UF logic in the Unsat Core Track.

Page generated on 2021-07-18 17:31:25 +0000

Benchmarks: 2061
Time Limit: 1200 seconds
Memory Limit: 60 GB

Winners

Sequential Performance: SMTInterpol
Parallel Performance:   SMTInterpol

Sequential Performance

Solver                      Error Score  Correct Score  CPU Time Score  Wall Time Score  Timeout  Memout
z3 (n)                      0            239021         4354.298        4347.229         0        0
2020-SMTInterpol-fixed (n)  0            238609         14240.346       8338.968         0        0
SMTInterpol                 0            238158         23034.853       12478.882        0        0
cvc5-uc                     0            237770         4915.613        4896.466         0        0
2020-Yices2-fixed (n)       0            237670         3725.328        3685.855         0        0
Yices2                      0            237670         3904.075        3846.085         0        0
MathSAT5 (n)                0            233468         1841.501        1824.572         0        0
SMTInterpol-remus           0            72337          1075464.527     1022241.164      34       0

Parallel Performance

Solver                      Error Score  Correct Score  CPU Time Score  Wall Time Score  Timeout  Memout
z3 (n)                      0            239021         4354.298        4347.229         0        0
2020-SMTInterpol-fixed (n)  0            238609         14240.346       8338.968         0        0
SMTInterpol                 0            238158         23034.853       12478.882        0        0
cvc5-uc                     0            237770         4915.613        4896.466         0        0
2020-Yices2-fixed (n)       0            237670         3725.328        3685.855         0        0
Yices2                      0            237670         3904.075        3846.085         0        0
MathSAT5 (n)                0            233468         1841.501        1824.572         0        0
SMTInterpol-remus           0            216860         1091544.597     997315.663       2        0

(n) Non-competing.
N/A: Benchmarks not known to be SAT/UNSAT, respectively.