SMT-COMP

The International Satisfiability Modulo Theories (SMT) Competition.

SMT-COMP 2022

ALIA (Proof Exhibition Track)

Competition results for the ALIA logic in the Proof Exhibition Track.

Page generated on 2022-08-10 11:19:33 +0000

Benchmarks: 41
Time Limit: 1200 seconds
Memory Limit: 60 GB

This track is experimental. Solvers are ranked by performance only; no winner is selected.

Sequential Performance

Solver        Error Score  Correct Score  CPU Time Score  Wall Time Score  Unsolved  Timeout  Memout
cvc5-lfsc     0            41             16.341          16.34            0         0        0
smtinterpol   0            41             108.009         39.309           0         0        0
cvc5          0            35             1681.163        1628.941         6         0        0
veriT         0            27             16801.492       16802.039        14        14       0

Parallel Performance

Solver        Error Score  Correct Score  CPU Time Score  Wall Time Score  Unsolved  Timeout  Memout
cvc5-lfsc     0            41             16.341          16.34            0         0        0
smtinterpol   0            41             108.009         39.309           0         0        0
cvc5          0            35             1681.163        1628.941         6         0        0
veriT         0            27             16801.832       16801.619        14        14       0
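
For reference, a ranking like the one above can be recomputed from the raw rows. The short Python sketch below sorts the parallel-performance rows by error score (ascending), correct score (descending), and wall-time score (ascending); this sort key follows the usual SMT-COMP ranking criteria and is an assumption here, since this page does not state the ordering itself.

    # Minimal sketch (assumed ordering): rank solvers by error score, then
    # correct score, then wall-time score. Rows are taken from the
    # parallel-performance table above.
    from dataclasses import dataclass

    @dataclass
    class Row:
        solver: str
        error: int
        correct: int
        cpu_time: float
        wall_time: float

    rows = [
        Row("cvc5-lfsc",   0, 41,    16.341,    16.34),
        Row("smtinterpol", 0, 41,   108.009,    39.309),
        Row("cvc5",        0, 35,  1681.163,  1628.941),
        Row("veriT",       0, 27, 16801.832, 16801.619),
    ]

    # Fewer errors first, more correct answers first, then less wall time.
    ranking = sorted(rows, key=lambda r: (r.error, -r.correct, r.wall_time))
    for place, r in enumerate(ranking, start=1):
        print(f"{place}. {r.solver}  correct={r.correct}  wall={r.wall_time:.3f}s")

With this key, cvc5-lfsc and smtinterpol tie on correct score and are separated by wall-time score, which reproduces the order in which the table lists them.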
