Core Concepts
The test error exhibits double descent under distribution shift, offering insights into data augmentation and the role of noise as an implicit regularizer.
Abstract
The study analyzes supervised denoising and noisy-input regression under distribution shift for low-rank data matrices. The theoretical analysis yields instance-specific expressions for the test error that distinguish benign, tempered, and catastrophic overfitting. Experiments on real-world data validate the theoretical predictions, matching the empirical mean-squared error closely when the data are approximately low-rank.
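The double-descent phenomenon referenced above can be reproduced in a toy setting. The sketch below is not the paper's model; it is a generic minimum-norm least-squares simulation (all parameter values are illustrative) in which the test error spikes as the number of features `p` crosses the sample size `n` and then falls again in the overparameterized regime.

```python
# Minimal double-descent sketch (illustrative, not the paper's setup):
# min-norm linear regression with a varying number of visible features p.
import numpy as np

rng = np.random.default_rng(0)

def test_error(n=50, p=10, n_test=500, noise=0.5, trials=30, d=200):
    """Average test MSE of the minimum-norm least-squares fit
    when the model sees only the first p of d true features."""
    errs = []
    for _ in range(trials):
        w = rng.normal(size=d) / np.sqrt(d)          # ground-truth signal
        Xtr = rng.normal(size=(n, d))
        Xte = rng.normal(size=(n_test, d))
        ytr = Xtr @ w + noise * rng.normal(size=n)   # noisy training labels
        yte = Xte @ w                                # clean test labels
        # Minimum-norm solution restricted to the first p features.
        w_hat = np.linalg.pinv(Xtr[:, :p]) @ ytr
        errs.append(np.mean((Xte[:, :p] @ w_hat - yte) ** 2))
    return float(np.mean(errs))

# Test error as a function of model size: peaks near p == n (here 50),
# then descends again as p grows past n.
errs = {p: test_error(p=p) for p in [10, 25, 50, 100, 200]}
```

The spike at the interpolation threshold `p == n` comes from the near-singular design matrix amplifying label noise; adding features beyond `n` implicitly regularizes the min-norm fit, which is the "second descent".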
Stats
We show that the test error exhibits double descent under general distribution shift.
The relative error between the generalization error estimate and the average empirical error is under 1% on average.