The paper focuses on computing the local Lipschitz constant of feedforward neural networks (FNNs) with rectified linear unit (ReLU) activation functions. It makes the following key contributions:
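For reference, a standard definition of the quantity in question; the norm and radius conventions below are generic assumptions, not necessarily the paper's exact choices:

```latex
% Local Lipschitz constant of an FNN F around a target input x*, over radius \varepsilon:
\[
  L_{\varepsilon}(x^{\star})
  \;=\;
  \sup_{0 < \|x - x^{\star}\| \le \varepsilon}
  \frac{\|F(x) - F(x^{\star})\|}{\|x - x^{\star}\|}.
\]
```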
It introduces a new set of copositive multipliers that can accurately capture the behavior of ReLUs, and shows that this set encompasses existing multiplier sets used in prior work.
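A minimal sketch of why ReLU admits copositive multipliers, from the generic quadratic-constraint viewpoint (the paper's parameterization of its multiplier set may differ): for z = max(0, x) applied elementwise, both z and z − x are entrywise nonnegative, so any copositive matrix yields a valid quadratic constraint on the stacked signal.

```latex
\[
  z = \max(0, x) \;\Rightarrow\;
  \xi := \begin{bmatrix} z \\ z - x \end{bmatrix} \in \mathbb{R}_{+}^{2m},
  \qquad
  \xi^{\top} C\, \xi \ge 0 \quad \text{for every copositive } C.
\]
% Since the copositive cone strictly contains {PSD + entrywise nonnegative}
% in dimension >= 5, such multipliers subsume standard multiplier sets.
```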
It formulates the upper bound computation of the local Lipschitz constant as a semidefinite programming (SDP) problem using the copositive multipliers.
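The paper's SDP (copositive multipliers plus a local ball constraint) is involved, so as a simpler runnable illustration of the same mechanism, here is a LipSDP-style program with diagonal multipliers for a hypothetical one-hidden-layer ReLU network; the network, weights, and dimensions below are placeholders, not from the paper:

```python
import numpy as np
import cvxpy as cp

# Hypothetical toy network f(x) = W2 @ relu(W1 @ x); random placeholder weights.
rng = np.random.default_rng(0)
n_in, n_hid, n_out = 4, 10, 2
W1 = rng.standard_normal((n_hid, n_in)) / np.sqrt(n_in)
W2 = rng.standard_normal((n_out, n_hid)) / np.sqrt(n_hid)

rho = cp.Variable(nonneg=True)        # rho plays the role of L^2
t = cp.Variable(n_hid, nonneg=True)   # diagonal multipliers for ReLU (slope in [0, 1])
T = cp.diag(t)

# M <= 0 certifies ||f(x) - f(y)||_2 <= sqrt(rho) * ||x - y||_2 via the S-procedure.
M = cp.bmat([
    [-rho * np.eye(n_in), W1.T @ T],
    [T @ W1, -2 * T + W2.T @ W2],
])
prob = cp.Problem(cp.Minimize(rho), [(M + M.T) / 2 << 0])  # symmetrize for cvxpy
prob.solve(solver=cp.SCS)
print("certified Lipschitz upper bound:", np.sqrt(rho.value))
```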
By analyzing the dual of the SDP, it derives a rank condition on the dual optimal solution that enables verifying the exactness of the computed upper bound. This also allows extracting the worst-case input that maximizes the deviation from the original output.
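Continuing the sketch above, a generic way such a rank condition can be acted on numerically: read the dual optimal matrix of the LMI constraint, test whether it is numerically rank-1, and extract a candidate worst-case direction from its top eigenvector. This recipe and the tolerance are illustrative assumptions; the paper's actual condition and extraction procedure are specific to its SDP.

```python
Z = prob.constraints[0].dual_value       # dual optimal matrix of the LMI
w, V = np.linalg.eigh(Z)                 # eigenvalues in ascending order
if w[-1] > 0 and w[-2] / w[-1] < 1e-6:   # all but the top eigenvalue negligible
    v = V[:, -1]
    dx = v[:n_in] / np.linalg.norm(v[:n_in])  # candidate worst-case input direction
    print("bound certified exact; worst-case direction:", dx)
```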
To handle practical FNNs with hundreds of ReLUs, for which the original SDP becomes intractable, it proposes a method to construct a reduced-order model whose input-output behavior is identical to that of the original FNN in a neighborhood of the target input.
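A rough sketch of this idea under an assumed interval-bound criterion (the paper's construction may differ): ReLUs whose sign is certain on a ball around the target input can be folded into affine terms, leaving a smaller network with identical local input-output behavior. All names here are hypothetical.

```python
import numpy as np

def reduce_relu_layer(W1, b1, W2, b2, x0, eps):
    """Sketch: reduce f(x) = W2 @ relu(W1 @ x + b1) + b2 on the ball
    ||x - x0||_inf <= eps by fixing ReLUs whose sign is certain there."""
    center = W1 @ x0 + b1
    radius = np.abs(W1).sum(axis=1) * eps   # interval bound on pre-activations
    lower, upper = center - radius, center + radius
    active = lower >= 0                     # ReLU provably acts as identity
    inactive = upper <= 0                   # ReLU provably outputs zero
    keep = ~(active | inactive)             # only these ReLUs survive
    A = W2[:, active] @ W1[active]          # fold active units into an affine bypass
    c = W2[:, active] @ b1[active] + b2
    return W1[keep], b1[keep], W2[:, keep], A, c

# On the ball, f(x) == W2k @ relu(W1k @ x + b1k) + A @ x + c exactly,
# with potentially far fewer ReLUs entering the SDP.
rng = np.random.default_rng(1)
W1 = rng.standard_normal((200, 8)); b1 = rng.standard_normal(200)
W2 = rng.standard_normal((3, 200)); b2 = rng.standard_normal(3)
W1k, b1k, W2k, A, c = reduce_relu_layer(W1, b1, W2, b2, x0=np.zeros(8), eps=0.05)
print("ReLUs kept:", W1k.shape[0], "of", W1.shape[0])
```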
The paper demonstrates the effectiveness of the proposed methods through numerical examples on both academic and practical FNN models.
Key insights distilled from the source by Yoshio Ebiha... at arxiv.org, 04-09-2024: https://arxiv.org/pdf/2310.11104.pdf