Storage Capacity and Solution Space Structure of Fully Connected Two-Layer Neural Networks with Generic Activation Functions
We show that the storage capacity of fully connected two-layer neural networks with generic activation functions remains finite even in the infinite-width limit, and that the weights exhibit negative correlations, reflecting a division of labor among hidden units. As the dataset size increases, the system undergoes a phase transition from a permutation-symmetric phase to a permutation-symmetry-broken phase.
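The storage setting described above can be sketched numerically: a fully connected two-layer network with a generic activation is trained to memorize random input-label pairs, after which one can inspect the pairwise correlations between hidden-unit weight vectors. The following is a minimal illustrative sketch, not the paper's method; the tanh activation, squared loss, fixed second-layer weights, and all hyperparameters are assumptions chosen for simplicity.

```python
import numpy as np

def train_two_layer(X, y, K=4, lr=0.05, epochs=2000, seed=0):
    """Fit y with f(x) = sum_k a_k * tanh(w_k . x) by gradient descent.

    Illustrative storage setup: P random N-dimensional inputs with random
    binary labels, K hidden units, a generic (here tanh) activation.
    Only the first-layer weights W are trained; the second layer a is fixed.
    """
    rng = np.random.default_rng(seed)
    N = X.shape[1]
    W = rng.normal(size=(K, N)) / np.sqrt(N)  # first-layer weights
    a = np.ones(K) / K                        # fixed second-layer weights
    for _ in range(epochs):
        H = np.tanh(X @ W.T)                  # (P, K) hidden activations
        err = H @ a - y                       # squared-loss residual
        # gradient of the mean squared loss wrt W (chain rule through tanh)
        grad = ((err[:, None] * a) * (1.0 - H**2)).T @ X / len(y)
        W -= lr * grad
    return W, a

def weight_correlations(W):
    """Pairwise cosine similarities between hidden-unit weight vectors."""
    Wn = W / np.linalg.norm(W, axis=1, keepdims=True)
    C = Wn @ Wn.T
    return C[np.triu_indices_from(C, k=1)]
```

In this toy setup, negative off-diagonal entries of `weight_correlations(W)` would correspond to the division of labor described in the abstract, with different hidden units specializing in different directions of input space.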