SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.


SMT-COMP 2023


QF_LinearIntArith (Model Validation Track)

Competition results for the QF_LinearIntArith division in the Model Validation Track.

Page generated on 2023-07-06 16:06:00 +0000

Benchmarks: 4958
Time Limit: 1200 seconds
Memory Limit: 60 GB


Winners

Sequential Performance: Z3++
Parallel Performance: Z3++

Sequential Performance

| Solver      | Error Score | Correct Score | CPU Time Score | Wall Time Score | Abstained | Timeout Memout |
|-------------|-------------|---------------|----------------|-----------------|-----------|----------------|
| Z3++        | 0           | 4767          | 115200.111     | 114984.689      | 171       | 8              |
| 2022-Z3++ⁿ  | 0           | 4766          | 113200.285     | 113027.729      | 171       | 8              |
| OpenSMT     | 0           | 4597          | 265005.021     | 264957.845      | 355       | 0              |
| cvc5        | 0           | 4568          | 154352.175     | 154195.064      | 386       | 0              |
| Yices2      | 0           | 4565          | 82018.756      | 81906.774       | 390       | 0              |
| SMTInterpol | 0           | 4269          | 216584.218     | 187123.226      | 85        | 0              |
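The ordering in the table above can be reproduced with a short script. This is only a sketch of the apparent ranking criteria (fewest errors, then most correct answers, then lowest CPU time); the official SMT-COMP scoring procedure is more involved, and the tie-breaking order here is an assumption.

```python
# Sketch: rank solvers from the sequential-performance table.
# Assumed ordering (not the official scoring tool): fewer errors first,
# then more correct answers, then lower CPU time.
results = [
    # (solver, error_score, correct_score, cpu_time)
    ("Z3++",        0, 4767, 115200.111),
    ("2022-Z3++",   0, 4766, 113200.285),
    ("OpenSMT",     0, 4597, 265005.021),
    ("cvc5",        0, 4568, 154352.175),
    ("Yices2",      0, 4565,  82018.756),
    ("SMTInterpol", 0, 4269, 216584.218),
]

# Sort key: ascending errors, descending correct score, ascending CPU time.
ranked = sorted(results, key=lambda r: (r[1], -r[2], r[3]))

for solver, _, correct, cpu in ranked:
    print(f"{solver}: {correct} correct, {cpu:.3f} s CPU")
```

With these numbers every solver has an error score of 0, so the correct score alone determines the order and Z3++ ranks first.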

Parallel Performance

| Solver      | Error Score | Correct Score | CPU Time Score | Wall Time Score | Abstained | Timeout Memout |
|-------------|-------------|---------------|----------------|-----------------|-----------|----------------|
| Z3++        | 0           | 4767          | 115200.111     | 114984.689      | 171       | 8              |
| 2022-Z3++ⁿ  | 0           | 4766          | 113200.285     | 113027.729      | 171       | 8              |
| OpenSMT     | 0           | 4597          | 265005.021     | 264957.845      | 355       | 0              |
| cvc5        | 0           | 4568          | 154352.175     | 154195.064      | 386       | 0              |
| Yices2      | 0           | 4565          | 82018.756      | 81906.774       | 390       | 0              |
| SMTInterpol | 0           | 4271          | 219014.858     | 189465.566      | 83        | 0              |

ⁿ Non-competing.
Abstained: total number of benchmarks, across the logics in this division, on which the solver chose to abstain. For SAT/UNSAT scores, this column also counts benchmarks not known to be SAT/UNSAT.