Core Concepts
This paper proposes a method to accurately compute the local Lipschitz constant of feedforward neural networks with ReLU activation functions, and derives a condition to verify the exactness of the computed upper bound.
Abstract
The paper focuses on computing the local Lipschitz constant of feedforward neural networks (FNNs) with rectified linear unit (ReLU) activation functions. It makes the following key contributions:
It introduces a new set of copositive multipliers that can accurately capture the behavior of ReLUs, and shows that this set encompasses existing multiplier sets used in prior work.
It formulates the upper bound computation of the local Lipschitz constant as a semidefinite programming (SDP) problem using the copositive multipliers.
By analyzing the dual of the SDP, it derives a rank condition on the dual optimal solution that enables verifying the exactness of the computed upper bound. This also allows extracting the worst-case input that maximizes the deviation from the original output.
To handle practical FNNs with hundreds of ReLUs, for which the original SDP becomes intractable, it proposes a method to construct a reduced-order model whose input-output behavior is identical to that of the original FNN around the target input.
The paper demonstrates the effectiveness of the proposed methods through numerical examples on both academic and practical FNN models.
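To see what the SDP upper bound improves on, it helps to have a crude baseline: since ReLU is 1-Lipschitz, the product of the layer weights' spectral norms is a simple global upper bound on any (local or global) Lipschitz constant. Below is a minimal numpy sketch on a hypothetical two-layer ReLU FNN; the weights, shapes, and names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy FNN: G(w) = W2 @ relu(W1 @ w + b1) + b2 (illustrative sizes).
W1, b1 = rng.standard_normal((8, 3)), rng.standard_normal(8)
W2, b2 = rng.standard_normal((2, 8)), rng.standard_normal(2)

def G(w):
    return W2 @ np.maximum(W1 @ w + b1, 0.0) + b2

# ReLU is 1-Lipschitz, so composing the layers gives the naive bound
# L <= ||W2||_2 * ||W1||_2 (product of spectral norms).
naive_bound = np.linalg.norm(W2, 2) * np.linalg.norm(W1, 2)
```

This bound ignores which ReLUs can actually switch near a given input, which is exactly the slack the copositive-multiplier SDP is designed to remove.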
Stats
The local Lipschitz constant L_{w0,ε} is defined as the minimum L such that ‖G(w) − G(w0)‖₂ ≤ L‖w − w0‖₂ for all w ∈ Bε(w0), where G is the FNN and w0 is the target input.
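This definition can be illustrated by sampling: maximizing ‖G(w) − G(w0)‖₂ / ‖w − w0‖₂ over sampled points of Bε(w0) gives a Monte-Carlo lower bound on L_{w0,ε} (the paper's SDP supplies the complementary upper bound). The toy network and all names below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical toy FNN (illustrative weights, not from the paper).
W1, b1 = rng.standard_normal((8, 3)), rng.standard_normal(8)
W2, b2 = rng.standard_normal((2, 8)), rng.standard_normal(2)

def G(w):
    return W2 @ np.maximum(W1 @ w + b1, 0.0) + b2

def sampled_local_lipschitz(G, w0, eps, n_samples=2000):
    """Monte-Carlo LOWER bound on L_{w0,eps}: the max over sampled
    w in B_eps(w0) of ||G(w) - G(w0)||_2 / ||w - w0||_2."""
    y0, best = G(w0), 0.0
    for _ in range(n_samples):
        d = rng.standard_normal(w0.shape)
        # Rescale to a point uniformly distributed in the eps-ball.
        d *= eps * rng.uniform() ** (1.0 / w0.size) / np.linalg.norm(d)
        ratio = np.linalg.norm(G(w0 + d) - y0) / np.linalg.norm(d)
        best = max(best, ratio)
    return best
```

Because this only samples, it can never exceed the true L_{w0,ε}; the gap to the SDP upper bound is what the paper's rank-based exactness condition certifies away.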
The reduced-order model Gr has nr ReLUs, where nr ≪ n and n is the number of ReLUs in the original FNN G.
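One standard way such a reduction can be realized (a sketch of the general idea under stated assumptions, not necessarily the paper's exact construction): propagate the ball Bε(w0) through the first layer with norm bounds; any ReLU whose pre-activation sign is fixed over the whole ball acts as the identity (or as zero) there, so only the remaining undetermined ReLUs need to be kept, giving nr. The layer sizes and names below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical first layer of a ReLU FNN (illustrative sizes).
W1, b1 = rng.standard_normal((200, 10)), rng.standard_normal(200)
w0, eps = rng.standard_normal(10), 0.1

# Over the l2-ball B_eps(w0), each pre-activation z_i = W1[i] @ w + b1[i]
# stays within [c_i - eps*||W1[i]||_2, c_i + eps*||W1[i]||_2] (Cauchy-Schwarz).
c = W1 @ w0 + b1
r = eps * np.linalg.norm(W1, axis=1)
lower, upper = c - r, c + r

always_active = lower >= 0    # ReLU is the identity on all of B_eps(w0)
always_inactive = upper <= 0  # ReLU outputs 0 on all of B_eps(w0)
undetermined = ~(always_active | always_inactive)

n_r = int(undetermined.sum())  # ReLUs that must remain in the reduced model
```

Neurons in the first two groups can be absorbed into the surrounding linear maps without changing the input-output behavior on Bε(w0), which is why the reduced model is exact around the target input rather than an approximation.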