SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2018

QF_AUFLIA (Unsat Core Track)

Competition results for the QF_AUFLIA division as of Thu Jul 12 23:54:00 GMT 2018.

Benchmarks in this division: 516
Time limit: 2400s

Winners

Sequential Performance   Parallel Performance
CVC4                     CVC4

Result table 1

Sequential Performance

Solver            Error Score   Reduction Score   avg. CPU time
SMTInterpol       0.000         605.859           3.087
Yices 2.6.0       0.000         13887.225         0.193
CVC4              0.000         20538.218         7.768
mathsat-5.5.2 n   0.000         1316.855          37.832
z3-4.7.1 n        0.000         19974.735         0.257

Parallel Performance

Solver            Error Score   Reduction Score   avg. CPU time   avg. WALL time
SMTInterpol       0.000         605.859           3.087           1.771
Yices 2.6.0       0.000         13887.225         0.193           0.194
CVC4              0.000         20538.218         7.768           7.772
mathsat-5.5.2 n   0.000         1316.855          37.832          37.835
z3-4.7.1 n        0.000         19974.735         0.257           0.257

n. Non-competing.

1. Scores are computed according to Section 7 of the rules.
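
The ordering behind the tables above can be reproduced with a minimal sketch, assuming the lexicographic comparison described in Section 7 of the rules: lower error score first, then higher reduction score, then lower average CPU time as the tie-breaker. The numbers below are copied from the sequential table; the ranking function is an illustrative reading of the rules, not the official competition scripts.

```python
from dataclasses import dataclass

@dataclass
class Result:
    solver: str
    error_score: float       # wrong results reported (lower is better)
    reduction_score: float   # assertions removed via unsat cores (higher is better)
    avg_cpu_time: float      # seconds, used as tie-breaker (lower is better)

# Data from the sequential performance table above.
# mathsat-5.5.2 and z3-4.7.1 were non-competing entries.
sequential = [
    Result("SMTInterpol",   0.0,   605.859,  3.087),
    Result("Yices 2.6.0",   0.0, 13887.225,  0.193),
    Result("CVC4",          0.0, 20538.218,  7.768),
    Result("mathsat-5.5.2", 0.0,  1316.855, 37.832),
    Result("z3-4.7.1",      0.0, 19974.735,  0.257),
]

def rank_key(r: Result):
    # Lexicographic order: error ascending, reduction descending, CPU time ascending.
    return (r.error_score, -r.reduction_score, r.avg_cpu_time)

for r in sorted(sequential, key=rank_key):
    print(f"{r.solver:15s} err={r.error_score:.3f} "
          f"red={r.reduction_score:10.3f} cpu={r.avg_cpu_time:7.3f}")
```

Under this reading, CVC4 ranks first on reduction score, consistent with its listing as the division winner; the non-competing z3-4.7.1 entry would place second.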