Local Loss Optimization in Infinite-Width Neural Networks: Analyzing Stable Parameterization for Predictive Coding and Target Propagation
This paper investigates stable parameterizations of local learning algorithms, specifically Predictive Coding (PC) and Target Propagation (TP), in the infinite-width limit of neural networks, analyzing the properties that emerge in this limit and what they imply for applying these algorithms to large-scale deep learning.
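To make the notion of a "local learning algorithm" concrete, the sketch below shows a minimal predictive-coding style update on a small multilayer perceptron: each layer minimizes its own prediction-error energy, and each weight update uses only quantities local to that layer. This is an illustrative assumption for exposition, not the paper's parameterization; the layer sizes, tanh nonlinearity, and step sizes are placeholders.

```python
# Minimal sketch of a predictive-coding (PC) style local update on a small MLP.
# Illustrative only: layer sizes, nonlinearity, and step sizes are assumptions,
# not the parameterization studied in the paper.
import numpy as np

rng = np.random.default_rng(0)

def f(x):   # elementwise nonlinearity (assumed)
    return np.tanh(x)

def df(x):  # its derivative
    return 1.0 - np.tanh(x) ** 2

# A 3-layer network: input -> hidden -> output.
sizes = [4, 8, 2]
Ws = [rng.standard_normal((sizes[l + 1], sizes[l])) / np.sqrt(sizes[l])
      for l in range(len(sizes) - 1)]

def pc_step(Ws, x_in, y_target, n_infer=20, lr_x=0.1, lr_w=0.01):
    # Initialize latent activities with a forward pass.
    xs = [x_in]
    for W in Ws:
        xs.append(W @ f(xs[-1]))

    # Inference phase: relax hidden activities to reduce the local energies
    # E_l = 0.5 * ||x_l - W_{l-1} f(x_{l-1})||^2, with the output clamped to the target.
    xs[-1] = y_target
    for _ in range(n_infer):
        for l in range(1, len(xs) - 1):
            eps_l = xs[l] - Ws[l - 1] @ f(xs[l - 1])          # prediction error at layer l
            eps_next = xs[l + 1] - Ws[l] @ f(xs[l])           # error arriving from the layer above
            grad = eps_l - df(xs[l]) * (Ws[l].T @ eps_next)   # dE/dx_l
            xs[l] -= lr_x * grad

    # Learning phase: each weight matrix is updated from purely local quantities.
    for l, W in enumerate(Ws):
        eps = xs[l + 1] - W @ f(xs[l])
        Ws[l] = W + lr_w * np.outer(eps, f(xs[l]))
    return Ws

# One training step on a toy input/target pair.
x_in = rng.standard_normal(4)
y_target = np.array([1.0, -1.0])
Ws = pc_step(Ws, x_in, y_target)
```

The question the paper addresses is how updates of this kind should be scaled with layer width so that training remains stable as the network grows toward the infinite-width limit.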