The International Satisfiability Modulo Theories (SMT) Competition.
Competition results for the QF_UFLIA logic in the Single Query Track.
Benchmarks: 300
Time Limit: 1200 seconds
Memory Limit: 60 GB
Winners
Sequential Performance | Parallel Performance | SAT Performance (parallel) | UNSAT Performance (parallel) | 24s Performance (parallel) |
---|---|---|---|---|
SMTInterpol | SMTInterpol | SMTInterpol | Yices2 | Yices2 |
Sequential Performance
Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout |
---|---|---|---|---|---|---|---|---|---|---|
SMTInterpol | 0 | 290 | 11673.085 | 9494.785 | 290 | 222 | 68 | 10 | 10 | 0 |
Yices2 | 0 | 280 | 4815.504 | 4817.035 | 280 | 211 | 69 | 20 | 20 | 0 |
2022-z3-4.8.17n | 0 | 275 | 625.872 | 623.9 | 275 | 208 | 67 | 25 | 25 | 0 |
cvc5 | 0 | 275 | 3956.743 | 3953.404 | 275 | 209 | 66 | 25 | 25 | 0 |
OpenSMT | 0 | 271 | 4001.462 | 3971.24 | 271 | 207 | 64 | 29 | 29 | 0 |
Parallel Performance
Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout |
---|---|---|---|---|---|---|---|---|---|---|
SMTInterpol | 0 | 290 | 11673.085 | 9494.785 | 290 | 222 | 68 | 10 | 10 | 0 |
Yices2 | 0 | 280 | 4815.504 | 4817.035 | 280 | 211 | 69 | 20 | 20 | 0 |
2022-z3-4.8.17n | 0 | 275 | 625.872 | 623.9 | 275 | 208 | 67 | 25 | 25 | 0 |
cvc5 | 0 | 275 | 3956.743 | 3953.404 | 275 | 209 | 66 | 25 | 25 | 0 |
OpenSMT | 0 | 271 | 4001.462 | 3971.24 | 271 | 207 | 64 | 29 | 29 | 0 |
SAT Performance (parallel)
Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | N/A | Timeout | Memout |
---|---|---|---|---|---|---|---|---|---|---|---|
SMTInterpol | 0 | 222 | 7449.777 | 5956.137 | 222 | 222 | 0 | 0 | 78 | 10 | 0 |
Yices2 | 0 | 211 | 2850.5 | 2851.424 | 211 | 211 | 0 | 11 | 78 | 20 | 0 |
cvc5 | 0 | 209 | 1751.27 | 1747.318 | 209 | 209 | 0 | 13 | 78 | 25 | 0 |
2022-z3-4.8.17n | 0 | 208 | 227.805 | 227.453 | 208 | 208 | 0 | 14 | 78 | 25 | 0 |
OpenSMT | 0 | 207 | 2651.148 | 2620.705 | 207 | 207 | 0 | 15 | 78 | 29 | 0 |
UNSAT Performance (parallel)
Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | N/A | Timeout | Memout |
---|---|---|---|---|---|---|---|---|---|---|---|
Yices2 | 0 | 69 | 1965.005 | 1965.611 | 69 | 0 | 69 | 2 | 229 | 20 | 0 |
SMTInterpol | 0 | 68 | 4223.307 | 3538.648 | 68 | 0 | 68 | 3 | 229 | 10 | 0 |
2022-z3-4.8.17n | 0 | 67 | 398.067 | 396.447 | 67 | 0 | 67 | 4 | 229 | 25 | 0 |
cvc5 | 0 | 66 | 2205.473 | 2206.086 | 66 | 0 | 66 | 5 | 229 | 25 | 0 |
OpenSMT | 0 | 64 | 1350.314 | 1350.536 | 64 | 0 | 64 | 7 | 229 | 29 | 0 |
24s Performance (parallel)
Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout |
---|---|---|---|---|---|---|---|---|---|---|
Yices2 | 0 | 268 | 114.361 | 114.928 | 268 | 201 | 67 | 32 | 32 | 0 |
2022-z3-4.8.17n | 0 | 264 | 127.928 | 125.914 | 264 | 203 | 61 | 36 | 36 | 0 |
cvc5 | 0 | 259 | 293.245 | 289.142 | 259 | 202 | 57 | 41 | 41 | 0 |
SMTInterpol | 0 | 258 | 1251.71 | 485.006 | 258 | 201 | 57 | 42 | 42 | 0 |
OpenSMT | 0 | 257 | 573.537 | 542.843 | 257 | 195 | 62 | 43 | 43 | 0 |
n: Non-competing.
N/A: Benchmarks not known to be SAT/UNSAT, respectively.
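For readers who want to explore these numbers programmatically, below is a minimal sketch (not part of the competition tooling) that loads the Sequential Performance rows shown above and orders the solvers the way that table does, by Correct Score descending with Wall Time Score as a tie-break. The sort key is only an assumption that reproduces the displayed ordering; the official scoring and ranking rules are defined by SMT-COMP itself.

```python
# Minimal sketch for exploring the result rows shown above.
# The data is copied from the Sequential Performance table; the sort
# key (fewest errors, most correct answers, then lowest wall time)
# merely reproduces that table's ordering and is NOT the official
# SMT-COMP scoring definition.
from dataclasses import dataclass


@dataclass
class Result:
    solver: str
    error_score: int
    correct_score: int
    cpu_time: float
    wall_time: float


sequential = [
    Result("SMTInterpol",     0, 290, 11673.085, 9494.785),
    Result("Yices2",          0, 280,  4815.504, 4817.035),
    Result("2022-z3-4.8.17n", 0, 275,   625.872,  623.900),
    Result("cvc5",            0, 275,  3956.743, 3953.404),
    Result("OpenSMT",         0, 271,  4001.462, 3971.240),
]

# Order by fewest errors, most correct answers, then fastest wall time.
ranked = sorted(
    sequential,
    key=lambda r: (r.error_score, -r.correct_score, r.wall_time),
)

for place, r in enumerate(ranked, start=1):
    print(f"{place}. {r.solver}: {r.correct_score} correct, "
          f"{r.wall_time:.3f}s wall time")
```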