The International Satisfiability Modulo Theories (SMT) Competition.
Competition results for the QF_FPArith division in the Model Validation Track.
Page generated on 2023-07-06 16:06:00 +0000
Benchmarks: 24417
Time Limit: 1200 seconds
Memory Limit: 60 GB
Logics:
Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Abstained | Timeout | Memout
---|---|---|---|---|---|---|---
Bitwuzla | 0 | 24362 | 6118.691 | 6161.282 | 43 | 0 |
Bitwuzla Fixedⁿ | 0 | 24361 | 6131.126 | 6203.65 | 43 | 0 |
cvc5 | 0 | 24351 | 9504.916 | 9466.667 | 56 | 0 |
2022-Bitwuzlaⁿ | 0 | 17289 | 7472.056 | 7513.256 | 39 | 0 |
ⁿ Non-competing.
Abstained: the number of benchmarks in this division's logics on which the solver chose to abstain. For SAT/UNSAT scores, this column also includes benchmarks whose SAT/UNSAT status is not known.