SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.


SMT-COMP 2022


QF_NIRA (Single Query Track)

Competition results for the QF_NIRA logic in the Single Query Track.

Page generated on 2022-08-10 11:17:45 +0000

Benchmarks: 2
Time Limit: 1200 seconds
Memory Limit: 60 GB

Winners

| Sequential Performance | Parallel Performance | SAT Performance (parallel) | UNSAT Performance (parallel) | 24s Performance (parallel) |
| --- | --- | --- | --- | --- |
| cvc5 | cvc5 | — | cvc5 | — |

Sequential Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| z3-4.8.17ⁿ | 0 | 1 | 1203.194 | 1203.193 | 1 | 0 | 1 | 1 | 1 | 0 |
| cvc5 | 0 | 1 | 1411.549 | 1412.1 | 1 | 0 | 1 | 1 | 1 | 0 |
| MathSATⁿ | 0 | 1 | 1442.408 | 1442.763 | 1 | 0 | 1 | 1 | 1 | 0 |
| Yices2 | 0 | 0 | 2399.69 | 2400.06 | 0 | 0 | 0 | 2 | 2 | 0 |
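The ordering of the rows above is consistent with the usual SMT-COMP ranking: fewer errors first, then more correct answers, then lower time score. The sketch below illustrates that comparator on the Sequential Performance data; the `Row` type and `rank_key` helper are illustrative assumptions, not part of the official scoring scripts.

```python
# Hedged sketch of a Single Query ranking comparator, assuming the
# usual SMT-COMP ordering: fewer errors, then more correct answers,
# then lower wall-time score. Numbers are from the table above.
from dataclasses import dataclass

@dataclass
class Row:
    solver: str
    error_score: int
    correct_score: int
    wall_time: float

def rank_key(r: Row):
    # Ascending sort: lower error score, higher correct score
    # (negated so that "more correct" sorts first), lower time.
    return (r.error_score, -r.correct_score, r.wall_time)

rows = [
    Row("cvc5", 0, 1, 1412.1),
    Row("Yices2", 0, 0, 2400.06),
    Row("z3-4.8.17", 0, 1, 1203.193),
    Row("MathSAT", 0, 1, 1442.763),
]

ranked = sorted(rows, key=rank_key)
print([r.solver for r in ranked])
# → ['z3-4.8.17', 'cvc5', 'MathSAT', 'Yices2']
```

Note that z3-4.8.17 and MathSAT are non-competing entries, which is why cvc5 is the listed winner despite ranking behind z3-4.8.17 here.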

Parallel Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| z3-4.8.17ⁿ | 0 | 1 | 1203.194 | 1203.193 | 1 | 0 | 1 | 1 | 1 | 0 |
| cvc5 | 0 | 1 | 1412.009 | 1412.07 | 1 | 0 | 1 | 1 | 1 | 0 |
| MathSATⁿ | 0 | 1 | 1442.628 | 1442.743 | 1 | 0 | 1 | 1 | 1 | 0 |
| Yices2 | 0 | 0 | 2400.0 | 2400.0 | 0 | 0 | 0 | 2 | 2 | 0 |

SAT Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | N/A | Timeout | Memout |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| cvc5 | 0 | 0 | 0.0 | 0.0 | 0 | 0 | 0 | 0 | 2 | 1 | 0 |
| Yices2 | 0 | 0 | 0.0 | 0.0 | 0 | 0 | 0 | 0 | 2 | 2 | 0 |
| MathSATⁿ | 0 | 0 | 0.0 | 0.0 | 0 | 0 | 0 | 0 | 2 | 1 | 0 |
| z3-4.8.17ⁿ | 0 | 0 | 0.0 | 0.0 | 0 | 0 | 0 | 0 | 2 | 1 | 0 |

UNSAT Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | N/A | Timeout | Memout |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| z3-4.8.17ⁿ | 0 | 1 | 3.194 | 3.193 | 1 | 0 | 1 | 0 | 1 | 1 | 0 |
| cvc5 | 0 | 1 | 212.009 | 212.07 | 1 | 0 | 1 | 0 | 1 | 1 | 0 |
| MathSATⁿ | 0 | 1 | 242.628 | 242.743 | 1 | 0 | 1 | 0 | 1 | 1 | 0 |
| Yices2 | 0 | 0 | 1200.0 | 1200.0 | 0 | 0 | 0 | 1 | 1 | 2 | 0 |

24s Performance

| Solver | Error Score | Correct Score | CPU Time Score | Wall Time Score | Solved | Solved SAT | Solved UNSAT | Unsolved | Timeout | Memout |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| z3-4.8.17ⁿ | 0 | 1 | 27.194 | 27.193 | 1 | 0 | 1 | 1 | 1 | 0 |
| cvc5 | 0 | 0 | 48.0 | 48.0 | 0 | 0 | 0 | 2 | 2 | 0 |
| Yices2 | 0 | 0 | 48.0 | 48.0 | 0 | 0 | 0 | 2 | 2 | 0 |
| MathSATⁿ | 0 | 0 | 48.0 | 48.0 | 0 | 0 | 0 | 2 | 2 | 0 |

ⁿ Non-competing.
N/A: benchmarks not known to be SAT/UNSAT, respectively.