SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2021

QF_LinearIntArith (Unsat Core Track)

Competition results for the QF_LinearIntArith division in the Unsat Core Track.

Page generated on 2021-07-18 17:31:25 +0000

Benchmarks: 381
Time Limit: 1200 seconds
Memory Limit: 60 GB
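
For local experiments, the per-benchmark resource limits above (1200 seconds of CPU time, 60 GB of memory) can be approximated on a POSIX system with Python's `resource` module. This is only an illustrative sketch, not the competition's actual execution infrastructure; the `run_solver` helper and the stand-in `echo` command are hypothetical placeholders for a real solver invocation.

```python
import resource
import subprocess

TIME_LIMIT_S = 1200            # CPU-time limit stated on this page
MEM_LIMIT_BYTES = 60 * 10**9   # 60 GB memory limit stated on this page

def run_solver(cmd):
    """Run `cmd` under the division's CPU-time and memory limits (POSIX only)."""
    def set_limits():
        # Applied in the child process just before exec.
        resource.setrlimit(resource.RLIMIT_CPU, (TIME_LIMIT_S, TIME_LIMIT_S))
        resource.setrlimit(resource.RLIMIT_AS, (MEM_LIMIT_BYTES, MEM_LIMIT_BYTES))
    proc = subprocess.run(cmd, capture_output=True, text=True,
                          preexec_fn=set_limits)
    return proc.returncode, proc.stdout

# Stand-in command; a real run would pass a solver binary and a benchmark file.
rc, out = run_solver(["echo", "unsat"])
print(rc, out.strip())  # prints: 0 unsat
```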

Winners

Sequential Performance: cvc5-uc
Parallel Performance: cvc5-uc

Sequential Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Abstained | Timeout | Memout |
|--------|-------------|---------------|----------------|-----------------|-----------|---------|--------|
| cvc5-uc | 0 | 1404408 | 212950.301 | 212959.376 | | 165 | 0 |
| MathSAT5 (n) | 0 | 1309080 | 219217.415 | 219272.556 | | 179 | 1 |
| SMTInterpol | 0 | 1267402 | 137566.657 | 135809.774 | | 104 | 0 |
| Yices2 | 0 | 1049374 | 145996.373 | 146015.321 | | 110 | 0 |
| 2020-Yices2-fixed (n) | 0 | 1046504 | 145558.879 | 145569.92 | | 110 | 0 |
| z3 (n) | 0 | 972607 | 188294.508 | 188319.798 | | 154 | 0 |
| SMTInterpol-remus | 0 | 366734 | 267198.202 | 261103.491 | | 128 | 0 |
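
As an unofficial illustration of how such a table yields a ranking, the sketch below sorts the sequential rows by error score (ascending), then correct score (descending), then CPU time score (ascending). This is a simplification of the official scoring rules described in the competition's rules document, and the tuple layout is my own.

```python
# Rows transcribed from the Sequential Performance table:
# (solver, error score, correct score, CPU time score)
rows = [
    ("cvc5-uc",           0, 1404408, 212950.301),
    ("MathSAT5",          0, 1309080, 219217.415),
    ("SMTInterpol",       0, 1267402, 137566.657),
    ("Yices2",            0, 1049374, 145996.373),
    ("2020-Yices2-fixed", 0, 1046504, 145558.879),
    ("z3",                0,  972607, 188294.508),
    ("SMTInterpol-remus", 0,  366734, 267198.202),
]

# Fewest errors first, then highest correct score, then lowest CPU time.
ranking = sorted(rows, key=lambda r: (r[1], -r[2], r[3]))
print(ranking[0][0])  # prints: cvc5-uc
```

With every solver at zero errors here, the order is decided entirely by the correct score, matching the table's ordering.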

Parallel Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Abstained | Timeout | Memout |
|--------|-------------|---------------|----------------|-----------------|-----------|---------|--------|
| cvc5-uc | 0 | 1404408 | 212975.801 | 212952.726 | | 165 | 0 |
| MathSAT5 (n) | 0 | 1309080 | 219263.555 | 219264.296 | | 179 | 1 |
| SMTInterpol | 0 | 1267402 | 137566.657 | 135809.774 | | 104 | 0 |
| Yices2 | 0 | 1049374 | 146008.083 | 146010.401 | | 110 | 0 |
| 2020-Yices2-fixed (n) | 0 | 1046504 | 145576.629 | 145565.43 | | 110 | 0 |
| z3 (n) | 0 | 972607 | 188313.528 | 188313.988 | | 154 | 0 |
| SMTInterpol-remus | 0 | 735080 | 267669.222 | 260436.511 | | 115 | 0 |

(n) Non-competing.
Abstained: total number of benchmarks, across the logics in this division, that the solver chose to abstain from. For the SAT/UNSAT scores, this column also includes benchmarks not known to be SAT/UNSAT.