LDReg: Local Dimensionality Regularized Self-Supervised Learning (ICLR 2024)
Core Concepts
Representations learned via self-supervised learning can suffer from dimensional collapse; LDReg improves representation quality by regularizing the local intrinsic dimensionality (LID) of the learned representations.
Abstract
Representations learned via self-supervised learning (SSL) are susceptible to dimensional collapse, which degrades downstream task performance.
LDReg proposes local dimensionality regularization to address this issue.
LDReg is derived from theoretical insights on local intrinsic dimensionality and the Fisher-Rao metric.
Experiments show that LDReg improves representation quality across a range of SSL methods.
Results demonstrate consistent gains from LDReg in linear evaluation, transfer learning, and fine-tuning.
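The intuition above rests on estimating local intrinsic dimensionality from nearest-neighbor distances. A standard way to do this is the maximum-likelihood LID estimator of Levina and Bickel (2005), which uses the log-ratios of a point's k nearest-neighbor distances. Below is a minimal sketch of that estimator, not the paper's actual implementation; the function name, the epsilon, and the toy setup are illustrative assumptions.

```python
import numpy as np

def lid_mle(x, points, k=20):
    """Maximum-likelihood estimate of local intrinsic dimensionality (LID)
    at query point `x` (Levina & Bickel, 2005).

    `points` is an (n, d) array of other representations; `x` must not be
    one of them (a zero nearest-neighbor distance breaks the log).
    LID_hat = -[ (1/k) * sum_i log(r_i / r_k) ]^{-1},
    where r_1 <= ... <= r_k are the k smallest distances from x.
    """
    dists = np.sort(np.linalg.norm(points - x, axis=1))[:k]
    r_k = dists[-1]
    # Negative inverse mean of log distance ratios; small epsilon guards
    # against log(0) if two points coincide (illustrative choice).
    return -1.0 / np.mean(np.log(dists / r_k + 1e-12))

# Toy check: points on a 2-D plane embedded in 10-D should have LID near 2,
# even though the ambient (global) dimension is 10 -- the kind of local/global
# gap that dimensional collapse produces.
rng = np.random.default_rng(0)
pts = np.zeros((2000, 10))
pts[:, :2] = rng.random((2000, 2))
query = np.zeros(10)
query[:2] = 0.5
print(lid_mle(query, pts, k=20))  # close to 2, far below the ambient 10
```

A regularizer in the spirit of LDReg would penalize representations whose estimated LID is too low relative to the ambient dimension, encouraging points to locally span more directions.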
Stats
Representations learned via self-supervised learning (SSL) can be susceptible to dimensional collapse.
Dimensional collapse is one of the major causes of degraded performance on downstream tasks.
Previous work has linked dimensional collapse to low-quality learned representations.
Quotes
"Representations can span a high-dimensional space globally but collapse locally." - Hanxun Huang et al., ICLR 2024