Core Concepts
The authors address the biased optimization that supervised contrastive learning suffers from in long-tailed recognition by decoupling positive samples and leveraging patch-based self distillation, with the aim of improving performance on imbalanced datasets.
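For background (the summary refers to SCL without reproducing it), the standard supervised contrastive objective from Khosla et al. is:

```latex
\mathcal{L}_{\mathrm{SCL}}
  = \sum_{i \in I} \frac{-1}{|P(i)|} \sum_{p \in P(i)}
    \log \frac{\exp(z_i \cdot z_p / \tau)}
              {\sum_{a \in A(i)} \exp(z_i \cdot z_a / \tau)}
```

Here z_i is the normalized embedding of anchor i, P(i) its in-batch positives (the other augmented view plus samples sharing its label), A(i) every other sample in the batch, and τ a temperature. Since |P(i)| tracks class frequency, head-class anchors average over many same-class positives while tail-class anchors depend almost entirely on their own augmented view; that dependence on batch composition is one reading of the biased optimization the paper decouples.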
Abstract
The paper tackles the challenges that imbalanced datasets pose for long-tailed recognition and proposes two components: Decoupled Supervised Contrastive Loss (DSCL) and Patch-based Self Distillation (PBSD). By optimizing the intra-category distance and leveraging visual patterns shared across classes, the method aims to improve performance on both head and tail classes. Experimental results demonstrate the effectiveness of the proposed approach, which outperforms recent works on several benchmarks.
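To illustrate the decoupling idea, here is a minimal PyTorch sketch. It assumes a two-view batch layout and a fixed weight `alpha` between the two positive types (the anchor's other augmented view vs. other same-class samples); `dscl_loss` and `alpha` are hypothetical names, and the weighting scheme is an assumption rather than the paper's actual formula.

```python
import torch
import torch.nn.functional as F

def dscl_loss(z, labels, alpha=0.5, tau=0.1):
    """Decoupled supervised contrastive loss (illustrative sketch).

    Assumes z[i] and z[i + N] are L2-normalized embeddings of two
    augmentations of the same image, for a batch of N images.
    `alpha` is an assumed fixed weight between the two positive types.
    """
    n2 = z.size(0)                               # 2N samples
    n = n2 // 2
    idx = torch.arange(n2, device=z.device)
    view_idx = idx.roll(n)                       # index of the other view

    sim = z @ z.t() / tau                        # pairwise similarities
    sim.fill_diagonal_(float('-inf'))            # exclude self-contrast
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)

    # Positive type 1: the anchor's other augmented view.
    view_pos = log_prob[idx, view_idx]

    # Positive type 2: other in-batch samples sharing the anchor's label.
    same_cls = labels.unsqueeze(0) == labels.unsqueeze(1)
    same_cls.fill_diagonal_(False)
    same_cls[idx, view_idx] = False
    cls_cnt = same_cls.sum(1)
    cls_pos = log_prob.masked_fill(~same_cls, 0.0).sum(1) / cls_cnt.clamp(min=1)

    # Decoupling: weight the two positive types with a fixed alpha,
    # independent of how many same-class samples the batch contains
    # (that dependence is the bias in plain SCL).
    has_cls = cls_cnt > 0
    loss = torch.where(has_cls,
                       -(alpha * view_pos + (1 - alpha) * cls_pos),
                       -view_pos)                # fall back to view only
    return loss.mean()

# Toy usage: 2 views of 4 images, 128-d normalized embeddings.
z = F.normalize(torch.randn(8, 128), dim=1)
labels = torch.tensor([0, 1, 0, 2, 0, 1, 0, 2])
print(dscl_loss(z, labels).item())
```

The point of the fixed `alpha` is that an anchor's gradient no longer depends on how many same-class samples happen to appear in the batch, which is what skews plain SCL toward head classes.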
Key points:
- Limitations of Supervised Contrastive Loss (SCL) in long-tailed recognition.
- Introduction of DSCL to address biased optimization for head and tail classes.
- Proposal of PBSD to transfer knowledge from head to tail classes using patch-based features (a sketch follows this list).
- Improved top-1 accuracy on the ImageNet-LT dataset (figures under Stats below).
- Comparison with recent methods highlighting superior performance across different datasets.
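The sketch referenced above: a hedged guess at what a patch-based self distillation term could look like. The summary only says PBSD uses patch-based features to transfer shared visual patterns, so the teacher/student construction and the KL form below are assumptions; `pbsd_loss` and the (not shown) patch-extraction step are illustrative.

```python
import torch
import torch.nn.functional as F

def pbsd_loss(img_feat, patch_feat, tau=0.1):
    """Patch-based self distillation term (illustrative sketch).

    img_feat:   (N, D) L2-normalized full-image features.
    patch_feat: (N, D) L2-normalized features of patches cropped from
                the same images (patch extraction is assumed, not shown).
    """
    # Teacher distribution: how each full image relates to the batch.
    teacher = F.softmax(img_feat @ img_feat.t() / tau, dim=1).detach()
    # Student distribution: how each patch relates to the full images.
    student = F.log_softmax(patch_feat @ img_feat.t() / tau, dim=1)
    # Matching the two distributions lets patch-level patterns shared
    # with head classes supervise tail-class representations.
    return F.kl_div(student, teacher, reduction='batchmean')

# Toy usage with random normalized features.
img = F.normalize(torch.randn(16, 128), dim=1)
patch = F.normalize(torch.randn(16, 128), dim=1)
print(pbsd_loss(img, patch).item())
```

Both branches would share one encoder (hence "self" distillation), with the teacher branch detached from the gradient as above.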
Stats
Achieves 57.7% top-1 accuracy on the ImageNet-LT dataset.
Performance is boosted to 59.7% with an ensemble-based method.
Outperforms recent works by 6.5% on long-tailed classification benchmarks.
Quotes
"By optimizing the intra-inter category distance, SCL has achieved impressive performance on balanced datasets."
"To improve the performance on long-tailed recognition, this paper addresses those two issues of SCL by decoupling the training objective."
"Our method is easy to implement and the code will be released to benefit future research."