SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.


SMT-COMP 2021


QF_LinearIntArith (Model Validation Track)

Competition results for the QF_LinearIntArith division in the Model Validation Track.

Page generated on 2021-07-18 17:31:50 +0000

Benchmarks: 2615
Time Limit: 1200 seconds
Memory Limit: 60 GB

Logics: QF_IDL, QF_LIA, QF_LIRA

Winners

Sequential Performance    Parallel Performance
cvc5-mv                   cvc5-mv

Sequential Performance

Solver                                  Error Score  Correct Score  CPU Time Score  Wall Time Score  Timeout  Memout
2020-z3 (n)                             0            2449           331938.894      331798.953       151      0
z3-mv (n)                               0            2341           457166.028      457088.896       274      0
cvc5-mv                                 0            2307           530224.611      530212.91        308      0
2020-Yices2-fixed Model Validation (n)  0            2264           469273.287      469284.686       351      0
2020-Yices2 Model Validation (n)        0            2264           469313.415      469252.339       351      0
Yices2 model-validation                 0            2259           475170.794      475163.536       356      0
MathSAT5 (n)                            0            2252           557628.331      557629.529       340      0
SMTInterpol                             0            2042           866663.53       840408.306       573      0
OpenSMT                                 0            1752           1107591.294     1213206.21       389      0
YicesLS                                 0            646            88765.195       88693.452        6        0
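The ordering of the rows above follows the competition's ranking scheme: solvers are compared first by error score (lower is better), then by correct score (higher is better), then by time score (lower is better). A minimal sketch of that lexicographic comparison, using a few rows from the table (the official SMT-COMP rules are authoritative; the field names here are illustrative):

```python
# Hypothetical sketch: rank solvers the way the table above is ordered.
# Assumed ordering (per the SMT-COMP rules pages): fewer errors first,
# then more correct answers, then lower wall-time score.
from typing import List, NamedTuple

class Result(NamedTuple):
    solver: str
    errors: int
    correct: int
    wall_time: float

def rank(results: List[Result]) -> List[Result]:
    # Lexicographic key: errors ascending, correct descending, time ascending.
    return sorted(results, key=lambda r: (r.errors, -r.correct, r.wall_time))

rows = [
    Result("cvc5-mv", 0, 2307, 530212.91),
    Result("z3-mv", 0, 2341, 457088.896),
    Result("SMTInterpol", 0, 2042, 840408.306),
]
print([r.solver for r in rank(rows)])  # z3-mv leads: most correct answers
```

Note that non-competing entries (marked "n") are listed but excluded when declaring winners, which is why cvc5-mv wins the division despite 2020-z3 and z3-mv scoring more correct answers.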

Parallel Performance

Solver                                  Error Score  Correct Score  CPU Time Score  Wall Time Score  Timeout  Memout
2020-z3 (n)                             0            2449           331961.694      331792.963       151      0
z3-mv (n)                               0            2341           457203.268      457077.756       274      0
cvc5-mv                                 0            2307           530266.721      530203.05        308      0
2020-Yices2 Model Validation (n)        0            2264           469342.955      469240.789       351      0
2020-Yices2-fixed Model Validation (n)  0            2264           469300.027      469273.246       351      0
Yices2 model-validation                 0            2259           475198.484      475155.116       356      0
MathSAT5 (n)                            0            2252           557681.151      557616.019       340      0
SMTInterpol                             0            2045           867407.81       840160.806       570      0
OpenSMT                                 0            1752           1133519.268     1213191.6        389      0
YicesLS                                 0            646            88774.955       88692.423        4        0

(n) Non-competing.
Abstained: total number of benchmarks in this division's logics that the solver chose to abstain from. For SAT/UNSAT scores, this column also includes benchmarks not known to be SAT/UNSAT.