Robust Neural Pruning with Gradient Sampling Optimization Maintains High Accuracy in Residual Neural Networks
Gradient sampling optimization techniques, such as StochGradAdam, preserve accuracy during and after neural network pruning significantly better than traditional optimization methods such as Adam.
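The core idea behind gradient sampling can be sketched as an Adam-style update that uses only a randomly selected subset of gradient elements at each step. The sketch below is a minimal illustration under that assumption; the `sample_rate` parameter and the element-wise masking scheme are illustrative choices, and the exact StochGradAdam update may differ in its details.

```python
import numpy as np

def gradient_sampling_adam_step(param, grad, m, v, t,
                                lr=1e-3, beta1=0.9, beta2=0.999,
                                eps=1e-8, sample_rate=0.7,
                                rng=np.random.default_rng(0)):
    """One Adam-style update on a random sample of gradient elements.

    Hypothetical sketch of the gradient-sampling idea, not the exact
    StochGradAdam algorithm. Elements are kept with probability
    `sample_rate`; the rest are zeroed before the moment updates.
    """
    # Randomly keep a fraction of gradient elements, zero the rest.
    mask = rng.random(grad.shape) < sample_rate
    g = np.where(mask, grad, 0.0)

    # Standard Adam moment estimates, computed on the sampled gradient.
    m = beta1 * m + (1 - beta1) * g
    v = beta2 * v + (1 - beta2) * g ** 2
    m_hat = m / (1 - beta1 ** t)   # bias-corrected first moment
    v_hat = v / (1 - beta2 ** t)   # bias-corrected second moment

    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v

# Usage: minimize f(x) = sum(x**2) for a few steps.
x = np.array([1.0, -2.0])
m = np.zeros_like(x)
v = np.zeros_like(x)
for t in range(1, 201):
    grad = 2.0 * x                 # gradient of sum(x**2)
    x, m, v = gradient_sampling_adam_step(x, grad, m, v, t)
```

Because only a subset of gradient coordinates is updated per step, the optimizer's updates are sparser, which is the property the thesis connects to better accuracy retention under pruning.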