Enhancing Neural Network Generalization and Calibration through Systematic Noise Injection Evaluation
A systematic evaluation of diverse noise injection methods reveals that certain noise types, such as AugMix, weak augmentation, and Dropout, can effectively improve both the generalization and calibration of neural networks across various datasets, tasks, and architectures. The findings emphasize the need for noise approaches tailored to specific domains, and for careful hyperparameter tuning when combining multiple noise types.
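To make the idea concrete, below is a minimal sketch (not the paper's exact setup) of how several noise injection methods can be combined in one PyTorch training step: weak input augmentation, additive Gaussian input noise, and Dropout on activations. The model, the `noise_std` parameter, and the specific augmentations are illustrative assumptions; AugMix could be swapped in via `torchvision.transforms.AugMix` where the data pipeline permits.

```python
# Illustrative sketch only: combining weak augmentation, Gaussian input noise,
# and Dropout in a single training step. Hyperparameters here (e.g. noise_std,
# dropout p) are placeholder values, not the paper's tuned settings.
import torch
import torch.nn as nn
import torchvision.transforms as T

# Weak augmentation: small random crops and horizontal flips.
weak_augment = T.Compose([
    T.RandomCrop(32, padding=4),
    T.RandomHorizontalFlip(),
])

# Simple classifier with Dropout, which injects noise into activations.
model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(3 * 32 * 32, 256),
    nn.ReLU(),
    nn.Dropout(p=0.3),
    nn.Linear(256, 10),
)

def training_step(images, labels, optimizer, noise_std=0.05):
    """One step with augmentation, input noise, and Dropout all active."""
    images = weak_augment(images)                            # weak augmentation
    images = images + noise_std * torch.randn_like(images)   # Gaussian input noise
    model.train()                                            # keeps Dropout enabled
    logits = model(images)
    loss = nn.functional.cross_entropy(logits, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

When stacking noise sources like this, the findings above suggest treating each one's strength (crop padding, `noise_std`, Dropout rate) as a jointly tuned hyperparameter rather than fixing them independently.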