SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2019

QF_LIA (Unsat Core Track)

Competition results for the QF_LIA division in the Unsat Core Track.

Page generated on 2019-07-23 17:57:45 +0000

Benchmarks: 271
Time Limit: 2400 seconds
Memory Limit: 60 GB

Winners

Sequential Performance    Parallel Performance
Yices 2.6.2               Yices 2.6.2

Sequential Performance

Solver                             Error Score  Correct Score  CPU Time Score  Wall Time Score  Abstained  Timeout  Memout
Yices 2.6.2                        0            945236         62767.807       62780.029                   23       0
Z3 (n)                             0            944280         99515.69        99527.718                   37       0
2018-SMTInterpol (unsat core) (n)  0            927528         57083.875       56289.003                   21       0
SMTInterpol                        0            810967         61136.681       60164.061                   22       0
CVC4-uc                            0            808319         221316.544      221336.248                  89       0

Parallel Performance

Solver                             Error Score  Correct Score  CPU Time Score  Wall Time Score  Abstained  Timeout  Memout
Yices 2.6.2                        0            945236         62777.987       62779.099                   23       0
Z3 (n)                             0            944280         99525.33        99526.138                   37       0
2018-SMTInterpol (unsat core) (n)  0            927528         57083.875       56289.003                   21       0
SMTInterpol                        0            810967         61136.681       60164.061                   22       0
CVC4-uc                            0            808319         221330.924      221333.458                  89       0

(n) Non-competing.
Abstained: Total number of benchmarks in logics in this division from which the solver chose to abstain. For SAT/UNSAT scores, this column also includes benchmarks not known to be SAT/UNSAT.
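
As an illustration of how the ranking above can be read, the following Python sketch re-sorts the sequential-performance rows by a simplified ordering: lower error score first, then higher correct score, then lower wall-time score as a tie-breaker. The tuple layout and the tie-breaking order are assumptions made for illustration only; the authoritative scoring and ranking procedure is defined in the SMT-COMP 2019 rules.

# Minimal sketch (not the official SMT-COMP scoring code): re-rank the
# sequential-performance table shown above.
# Tuple layout: (solver, error score, correct score, cpu time score, wall time score)
sequential = [
    ("Yices 2.6.2",                                0, 945236,  62767.807,  62780.029),
    ("Z3 (non-competing)",                         0, 944280,  99515.69,   99527.718),
    ("2018-SMTInterpol (unsat core, non-competing)", 0, 927528,  57083.875,  56289.003),
    ("SMTInterpol",                                0, 810967,  61136.681,  60164.061),
    ("CVC4-uc",                                    0, 808319, 221316.544, 221336.248),
]

# Sort key: fewer errors first, higher correct score next, lower wall time last.
ranking = sorted(sequential, key=lambda row: (row[1], -row[2], row[4]))

for place, (solver, err, correct, cpu, wall) in enumerate(ranking, start=1):
    print(f"{place}. {solver}: correct score {correct}, wall-time score {wall:.3f}")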