SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.


SMT-COMP 2021


QF_Equality+LinearArith (Unsat Core Track)

Competition results for the QF_Equality+LinearArith division in the Unsat Core Track.

Page generated on 2021-07-18 17:31:25 +0000

Benchmarks: 577
Time Limit: 1200 seconds
Memory Limit: 60 GB

Logics:

Winners

Sequential Performance   Parallel Performance
Yices2                   Yices2

Sequential Performance

Solver                Error Score   Correct Score   CPU Time Score   Wall Time Score   Abstained   Timeout   Memout
MathSAT5 ⁿ            0             858722          59814.437        59839.683         –           2         0
2020-Yices2-fixed ⁿ   0             814234          74035.403        74048.059         –           42        0
Yices2                0             814234          75150.362        75162.454         –           42        0
2020-z3 ⁿ             0             456356          85540.187        85554.321         –           59        0
SMTInterpol           0             371701          102530.616       98995.331         –           70        0
SMTInterpol-remus     0             365439          160585.778       153953.047        –           88        0
z3 ⁿ                  0             333851          102081.846       102098.419        –           66        0
2020-CVC4-uc ⁿ        0             204838          102555.808       102570.241        –           69        0
cvc5-uc               0             45192           120273.425       120291.003        –           95        0

Parallel Performance

Solver                Error Score   Correct Score   CPU Time Score   Wall Time Score   Abstained   Timeout   Memout
MathSAT5 ⁿ            0             858722          59827.477        59838.433         –           2         0
2020-Yices2-fixed ⁿ   0             814234          74040.103        74046.449         –           42        0
Yices2                0             814234          75155.612        75161.044         –           42        0
2020-z3 ⁿ             0             456356          85549.217        85551.951         –           59        0
SMTInterpol-remus     0             396803          161131.678       152651.437        –           69        0
SMTInterpol           0             382053          102583.866       98907.361         –           69        0
z3 ⁿ                  0             333851          102089.356       102095.459        –           66        0
2020-CVC4-uc ⁿ        0             204838          102566.948       102567.471        –           69        0
cvc5-uc               0             45192           120287.925       120286.673        –           95        0
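The winners line can be recovered from the tables: MathSAT5 posts the highest correct score, but it is marked non-competing, so Yices2 takes the division. A minimal sketch of that filtering, using the sequential-table figures above and assuming, purely as an illustration (the official rules define the exact scoring order), that ranking goes by correct score with lower wall-time score as tie-break:

```python
# Sequential-performance rows from the table above:
# (solver, non_competing, correct_score, wall_time_score)
rows = [
    ("MathSAT5",          True,  858722, 59839.683),
    ("2020-Yices2-fixed", True,  814234, 74048.059),
    ("Yices2",            False, 814234, 75162.454),
    ("2020-z3",           True,  456356, 85554.321),
    ("SMTInterpol",       False, 371701, 98995.331),
    ("SMTInterpol-remus", False, 365439, 153953.047),
    ("z3",                True,  333851, 102098.419),
    ("2020-CVC4-uc",      True,  204838, 102570.241),
    ("cvc5-uc",           False, 45192,  120291.003),
]

# Entries marked "n" on the page are non-competing and cannot win.
competing = [r for r in rows if not r[1]]

# Assumed ranking: highest correct score, ties broken by lower wall time.
winner = max(competing, key=lambda r: (r[2], -r[3]))
print(winner[0])  # Yices2
```

Note that excluding the non-competing 2020-Yices2-fixed entry is what makes Yices2 the winner despite its identical correct score; the 2020 solvers run hors concours as reference points.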

ⁿ Non-competing.
Abstained: total number of benchmarks, across the logics in this division, from which the solver chose to abstain. For SAT/UNSAT scores, this column also includes benchmarks not known to be SAT/UNSAT.