Key Concepts
The author proposes a function space, the Neural Hilbert Ladder (NHL), that characterizes the functions representable by multi-layer neural networks of arbitrary width. The NHL space is defined as an infinite union of reproducing kernel Hilbert spaces (RKHSs) and comes with a complexity measure that governs both the approximation and generalization properties of the corresponding networks.
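As a rough sketch of the underlying construction (the notation here is illustrative and omits the paper's precise quantitative conditions), a ladder is a sequence of kernels built recursively by pushing random fields through the activation $\sigma$:

$$
K^{(1)} \text{ fixed by the input layer}, \qquad
K^{(l+1)}(x, x') \;=\; \mathbb{E}_{f \sim \mu^{(l)}}\big[\sigma(f(x))\,\sigma(f(x'))\big],
$$

where $\mu^{(l)}$ is a probability measure (a random field) over the RKHS $\mathcal{H}^{(l)}$ induced by $K^{(l)}$. The $L$-level NHL space is then, roughly, the union of the top-level RKHSs $\mathcal{H}^{(L)}$ over all admissible ladders, and the NHL complexity of a function is an infimum, over the ladders realizing it, of a cost aggregating the norms along the ladder.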
Summary
The paper introduces the concept of a Neural Hilbert Ladder (NHL): a hierarchy of RKHSs constructed by alternating between random fields and the kernel functions they induce, as sketched above. The author shows that:
- Every function representable by a multi-layer neural network (NN) of arbitrary width belongs to the NHL space, with its NHL complexity bounded in terms of the representing network.
- Conversely, any function in the NHL space can be approximated by a finite-width multi-layer NN, with approximation error controlled by its NHL complexity.
- The NHL space and its complexity measure can be used to derive generalization guarantees for learning with multi-layer NNs.
- Under the ReLU activation, the NHL spaces exhibit a depth separation: the 3-layer NHL space is strictly larger than the 2-layer one.
- In the infinite-width mean-field limit, training a multi-layer NN corresponds to a non-Markovian learning dynamics in the NHL space, which exhibits feature learning beyond the fixed-kernel behavior of the Neural Tangent Kernel (a toy illustration appears at the end of this summary).
The author provides a comprehensive theoretical analysis of the NHL framework and demonstrates its advantages over prior approaches in characterizing the function space of multi-layer NNs.
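As a concrete, if informal, illustration of the feature-learning point referenced above, one can track the kernel induced by a finite network's hidden features during training: under feature learning this kernel moves with the data, whereas in the NTK picture it stays essentially fixed at large width. Below is a minimal PyTorch sketch; the toy data, architecture, and hyperparameters are arbitrary choices for illustration and are not taken from the paper.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy 1-D regression data (illustrative only).
x = torch.linspace(-2, 2, 64).unsqueeze(1)
y = torch.sin(3 * x)

# A small 3-layer ReLU MLP; widths are arbitrary.
net = nn.Sequential(
    nn.Linear(1, 256), nn.ReLU(),
    nn.Linear(256, 256), nn.ReLU(),
    nn.Linear(256, 1),
)

def hidden_kernel(model, inputs):
    """Gram matrix of the last hidden-layer features: K = Phi Phi^T / width."""
    feats = model[:-1](inputs)  # forward pass up to (and including) the last ReLU
    return (feats @ feats.T) / feats.shape[1]

K_before = hidden_kernel(net, x).detach()

opt = torch.optim.SGD(net.parameters(), lr=0.1)
for _ in range(2000):
    opt.zero_grad()
    loss = ((net(x) - y) ** 2).mean()
    loss.backward()
    opt.step()

K_after = hidden_kernel(net, x).detach()

# A clearly nonzero relative change means the induced kernel has moved,
# i.e. the hidden representation adapts to the data (feature learning).
rel_change = (K_after - K_before).norm() / K_before.norm()
print(f"final loss: {loss.item():.4f}  relative kernel change: {rel_change.item():.3f}")
```

The printed quantity is only a finite-width proxy: the paper's statement concerns the limiting dynamics of the kernels along the ladder, not this particular hidden-feature Gram matrix.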