The International Satisfiability Modulo Theories (SMT) Competition.
Competition results for the QF_Equality+LinearArith division in the Model Validation Track.
Page generated on 2022-08-10 11:19:11 +0000
Benchmarks: 891
Time Limit: 1200 seconds
Memory Limit: 60 GB
Logics:

Winners:

Sequential Performance | Parallel Performance |
---|---|
smtinterpol | smtinterpol |
Sequential Performance

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Abstained | Timeout | Memout |
---|---|---|---|---|---|---|---|
smtinterpol | 0 | 880 | 26314.573 | 22993.201 | 10 | 0 | |
2021-SMTInterpoln | 0 | 880 | 27162.66 | 23792.868 | 10 | 0 | |
cvc5 | 0 | 853 | 74790.819 | 74808.537 | 38 | 0 | |
z3-4.8.17n | 0 | 852 | 52266.91 | 52116.123 | 39 | 0 | |
Yices2 | 0 | 825 | 95264.222 | 95281.668 | 66 | 0 | |
OpenSMT | 0 | 681 | 6818.761 | 6792.508 | 4 | 0 | |
MathSATn | 0 | 538 | 125071.898 | 125098.828 | 103 | 0 | |
Parallel Performance

Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Abstained | Timeout | Memout |
---|---|---|---|---|---|---|---|
smtinterpol | 0 | 880 | 26314.573 | 22993.201 | 10 | 0 | |
2021-SMTInterpoln | 0 | 880 | 27162.66 | 23792.868 | 10 | 0 | |
cvc5 | 0 | 853 | 74799.919 | 74806.607 | 38 | 0 | |
z3-4.8.17n | 0 | 852 | 52269.11 | 52115.463 | 39 | 0 | |
Yices2 | 0 | 825 | 95274.722 | 95278.988 | 66 | 0 | |
OpenSMT | 0 | 681 | 6820.251 | 6792.308 | 4 | 0 | |
MathSATn | 0 | 538 | 125093.478 | 125093.868 | 103 | 0 | |
n: Non-competing (a solver name suffixed with n denotes an entry that is shown for comparison but not ranked).
Abstained: Total number of benchmarks in this division's logics that the solver chose to abstain from. For SAT/UNSAT scores, this column also includes benchmarks not known to be SAT/UNSAT.
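The columns above can be read as per-solver aggregates over the division's benchmarks. As a rough illustration of how such aggregates might be computed, here is a minimal sketch, assuming a simple per-benchmark record format; the record fields and the scoring details (e.g. which runs contribute to the time scores) are assumptions for illustration, not SMT-COMP's official scoring code.

```python
# Illustrative sketch: collapse one solver's per-benchmark results into
# division-level totals resembling the table columns (Error Score,
# Correct Score, CPU/Wall Time Score, Abstained). The BenchmarkResult
# format is a hypothetical assumption, not the competition's data model.
from dataclasses import dataclass


@dataclass
class BenchmarkResult:
    answer: str        # "sat", "unsat", "unknown", or "abstain"
    expected: str      # known status of the benchmark, or "unknown"
    cpu_time: float    # seconds of CPU time used
    wall_time: float   # seconds of wall-clock time used


def score_division(results):
    """Aggregate per-benchmark results into division totals."""
    error = correct = abstained = 0
    cpu = wall = 0.0
    for r in results:
        if r.answer == "abstain":
            abstained += 1
            continue
        if r.answer in ("sat", "unsat"):
            if r.expected not in ("unknown", r.answer):
                error += 1           # disagreed with the known status
            else:
                correct += 1
                cpu += r.cpu_time    # count time for solved benchmarks
                wall += r.wall_time
    return {"error": error, "correct": correct,
            "cpu": round(cpu, 3), "wall": round(wall, 3),
            "abstained": abstained}
```

Under this sketch, a solver that abstains on a benchmark contributes to the Abstained column but to none of the score columns, which matches how abstentions are described above.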