Efficient Pruning by Leveraging Saturation of Neurons in Neural Networks
The author explores leveraging dying (saturated) neurons in neural networks for efficient model compression and optimization through Demon Pruning (DemP), a method that combines regularization and noise injection to control the proliferation of dead neurons. DemP achieves better accuracy-sparsity tradeoffs and greater training speedups than existing structured pruning techniques.
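The core observation behind this line of work is that a saturated ReLU unit, one whose activation is zero for every input, contributes nothing to the network's output and can be removed exactly. The sketch below illustrates dead-neuron detection and structured removal on a toy two-layer network; it is a minimal NumPy illustration of the general idea, not the paper's DemP algorithm, and the network sizes and variable names are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-layer ReLU network (hypothetical example, not the paper's DemP method).
W1 = rng.normal(size=(8, 4))   # hidden layer: 8 units, 4 inputs
b1 = rng.normal(size=8)
W2 = rng.normal(size=(3, 8))   # output layer: 3 units

# Force two hidden units to be "dead": a large negative bias saturates ReLU at 0.
b1[[2, 5]] = -100.0

X = rng.normal(size=(256, 4))          # batch of inputs
H = np.maximum(X @ W1.T + b1, 0.0)     # hidden activations

# A unit is considered dead if it never activates on the batch.
dead = (H == 0).all(axis=0)
alive = ~dead

# Structured pruning: drop dead units' rows in W1/b1 and columns in W2.
W1p, b1p, W2p = W1[alive], b1[alive], W2[:, alive]

# The pruned network computes identical outputs, since dead units output zero.
Y_full   = np.maximum(X @ W1.T + b1, 0.0) @ W2.T
Y_pruned = np.maximum(X @ W1p.T + b1p, 0.0) @ W2p.T
assert np.allclose(Y_full, Y_pruned)
print(dead.sum(), "units pruned;", alive.sum(), "remain")
```

Because entire units (rows of `W1`, columns of `W2`) are removed rather than individual weights, the pruned matrices stay dense and smaller, which is what yields real training and inference speedups on standard hardware.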