Stochastic Subsampling with Average Pooling: A Regularization Technique for Deep Neural Networks
Stochastic average pooling, which combines stochastic subsampling, average pooling, and √p scaling, provides a Dropout-like regularization effect without the train-test inconsistency that Dropout introduces, and can be seamlessly integrated into existing deep neural network architectures in place of standard average pooling.
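To make the idea concrete, the sketch below shows one possible reading of the abstract: during training, spatial positions inside the pooling region are kept with probability p (stochastic subsampling), the kept activations are averaged (average pooling), and the result is multiplied by √p; at evaluation time the layer falls back to ordinary average pooling. The module name StochasticAvgPool2d, the parameter p, the per-example masking scheme, and the direction of the √p scaling are illustrative assumptions, not the paper's exact implementation.

```python
import torch
import torch.nn as nn


class StochasticAvgPool2d(nn.Module):
    """Hypothetical sketch of stochastic average pooling (global variant).

    Training: keep each spatial position with probability p, average the
    kept activations, and scale the result by sqrt(p).
    Evaluation: plain global average pooling, so no extra rescaling step.
    """

    def __init__(self, p: float = 0.5):
        super().__init__()
        assert 0.0 < p <= 1.0, "keep probability must be in (0, 1]"
        self.p = p

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (N, C, H, W) feature map
        n, c, h, w = x.shape
        x = x.flatten(2)                       # (N, C, H*W)
        if not self.training or self.p == 1.0:
            return x.mean(dim=2)               # standard global average pooling
        # One spatial keep-mask per example, shared across channels (assumption).
        mask = (torch.rand(n, 1, h * w, device=x.device) < self.p).float()
        kept = mask.sum(dim=2).clamp(min=1.0)  # number of kept positions
        pooled = (x * mask).sum(dim=2) / kept  # average over the sampled subset
        return pooled * self.p ** 0.5          # sqrt(p) scaling from the abstract


if __name__ == "__main__":
    pool = StochasticAvgPool2d(p=0.5)
    feats = torch.randn(8, 64, 7, 7)
    pool.train()
    print(pool(feats).shape)  # torch.Size([8, 64])
    pool.eval()
    print(pool(feats).shape)  # torch.Size([8, 64])
```

Because the stochastic branch is only active in training mode, the layer can be dropped into an existing architecture wherever global average pooling already appears, with inference behavior unchanged.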