The key insights and findings of the paper are:
The authors observe that a natural and complementary notion of complexity of a continuous function F: R^n → R is the number of homeomorphism classes of its sublevel sets F≤a for a ∈ R.
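For concreteness, here is a minimal, hypothetical one-dimensional sketch of this notion (my own illustration, not an example from the paper): the function, the grid, and the component-counting below are ad hoc choices, and in one dimension counting connected components of a sublevel set is only a crude proxy for its homeomorphism type.

```python
# Illustrative sketch (not from the paper): track how the connected components
# of the sublevel set F_{<=a} change as the threshold a varies, for a W-shaped
# PL function F(x) = ||x| - 1|, which is ReLU-realizable since |z| = relu(z) + relu(-z).
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def F(x):
    inner = relu(x) + relu(-x) - 1.0      # |x| - 1
    return relu(inner) + relu(-inner)     # ||x| - 1|

def sublevel_components(values, a):
    """Number of maximal runs of grid points with F <= a, i.e. the connected
    components of F_{<=a} as seen on the grid."""
    mask = (values <= a).astype(int)
    if mask.sum() == 0:
        return 0
    return int(mask[0] == 1) + int(np.sum((mask[1:] == 1) & (mask[:-1] == 0)))

xs = np.arange(-300, 301) / 100.0         # grid on [-3, 3] hitting the breakpoints exactly
vals = F(xs)
for a in [-0.5, 0.0, 0.5, 1.0, 2.0]:
    print(f"a = {a:4.1f}: F_<=a has {sublevel_components(vals, a)} component(s)")
# Expected: 0, 2, 2, 1, 1 -- empty set, two points, two intervals, then a single
# interval; the sublevel sets pass through several distinct homeomorphism types,
# and the changes happen exactly at the levels 0 and 1 of the local extrema.
```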
For smooth functions, Morse theory provides a toolkit for computing the algebro-topological invariants of the sublevel sets and understanding how they change as the threshold varies. However, for piecewise-linear (PL) functions realized by ReLU neural networks, a more delicate analysis is required.
The authors associate to each connected component K of the flat cells at level t its local homological complexity (local H-complexity), defined as the rank of the relative homology of the pair (F≤t, F≤t\K). They then define the total H-complexity of a finite PL map as the sum of these local H-complexities.
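As a hypothetical illustration of this definition (my own worked example, continuing the one-dimensional function F(x) = ||x| − 1| above and assuming the breakpoint vertices at x = −1, 0, 1 are exactly the flat cells), take the vertex K = {0} at its level t = 1:

```latex
% Hypothetical worked example (not from the paper): local H-complexity of the
% flat vertex K = {0} of F(x) = ||x|-1| at level t = 1.
\[
  F_{\le 1} = [-2,\,2], \qquad
  F_{\le 1}\setminus K = [-2,\,0)\cup(0,\,2] \simeq \{\text{two points}\},
\]
\[
  H_1\bigl(F_{\le 1},\, F_{\le 1}\setminus K\bigr) \cong \mathbb{Z}, \qquad
  H_k\bigl(F_{\le 1},\, F_{\le 1}\setminus K\bigr) = 0 \quad (k \neq 1),
\]
% by the long exact sequence of the pair, so the local H-complexity of K is 1.
```

Under the same reading, each of the two minimum vertices {±1} contributes rank 1 at level 0 (the pair is two points relative to one of them), so the total H-complexity of this toy function would be 3; again, this is an illustration of the summarized definition, not a computation from the paper.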
The authors prove that if the level-set complex C(F) contains no flat cells with level in [a, b], then the sublevel sets F≤a and F≤b are homotopy equivalent, as are the superlevel sets F≥a and F≥b. This allows them to give a coarse description of the topological complexity of F in terms of the Betti numbers of the sublevel sets F≤a for a ≪ 0 and the superlevel sets F≥a for a ≫ 0.
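As a rough numerical companion to this coarse description (my own sketch, not the authors' method), one can sample a small ReLU network on a bounded grid and count connected components of sublevel and superlevel sets near the extremes of the observed range. The network weights, grid window, quantile thresholds, and the use of scipy.ndimage.label (which only recovers β0, the zeroth Betti number) are all ad hoc simplifications of the exact statement on R^n.

```python
# Rough numerical sketch (not the authors' method): beta_0 of sublevel and
# superlevel sets of a small 2-D ReLU network, sampled on a bounded grid.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(8, 2)), rng.normal(size=8)   # hidden-layer weights (arbitrary)
w2 = rng.normal(size=8)                                # linear output layer

def F(points):
    """Evaluate the ReLU network at an (N, 2) array of inputs."""
    return np.maximum(points @ W1.T + b1, 0.0) @ w2

grid = np.linspace(-3.0, 3.0, 400)
X, Y = np.meshgrid(grid, grid)
values = F(np.column_stack([X.ravel(), Y.ravel()])).reshape(X.shape)

def beta0(mask):
    """Connected components of a boolean mask on the grid (zeroth Betti number)."""
    _, num = ndimage.label(mask)
    return num

# Crude stand-ins for "a << 0" and "a >> 0", restricted to the sampled window.
a_low, a_high = np.quantile(values, 0.05), np.quantile(values, 0.95)
print(f"beta_0 of the sublevel set  F_<=a, a = {a_low:.3f}: {beta0(values <= a_low)}")
print(f"beta_0 of the superlevel set F_>=a, a = {a_high:.3f}: {beta0(values >= a_high)}")
```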
The authors construct a canonical polytopal complex K(F) and a deformation retraction from the domain of F to K(F), which allows them to compute the homology of the sublevel and superlevel sets efficiently.
Finally, the authors present a construction showing that the local H-complexity of a ReLU neural network function can be arbitrarily large.
Key insights distilled from the source content by J. Elisenda ... at arxiv.org, 04-03-2024: https://arxiv.org/pdf/2204.06062.pdf