Key Concept
Flattening long-range loss landscapes in the representation space enhances transferability and fine-tuning in cross-domain few-shot learning.
Abstract
Cross-domain few-shot learning (CDFSL) aims to transfer prior knowledge learned on source domains to a target domain that offers only limited training data.
Its central challenges are transferring knowledge across dissimilar domains and fine-tuning models with only a handful of examples.
Extending the analysis of loss landscapes from the parameter space to the representation space reveals why such models are difficult both to transfer and to fine-tune.
Flattening the loss landscape in the representation space improves model transferability and fine-tuning.
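A common way to probe such landscapes, and a plausible reading of the analysis summarized above, is to evaluate the loss along a straight line between two representations (e.g., one from a source-trained encoder and one from a fine-tuned encoder) and measure how far the path rises above its endpoints. The sketch below is illustrative only: the toy linear head, the interpolation scheme, and the `barrier_height` helper are assumptions, not the paper's exact protocol.

```python
import numpy as np

def cross_entropy(logits, label):
    # Numerically stable cross-entropy for a single example.
    logits = logits - logits.max()
    log_probs = logits - np.log(np.exp(logits).sum())
    return -log_probs[label]

def loss_along_path(z_a, z_b, W, label, steps=11):
    """Loss of a fixed linear head at points linearly interpolated
    between two representations z_a and z_b (representation space,
    not parameter space)."""
    alphas = np.linspace(0.0, 1.0, steps)
    losses = []
    for a in alphas:
        z = (1 - a) * z_a + a * z_b  # interpolate the representation
        losses.append(cross_entropy(W @ z, label))
    return alphas, np.array(losses)

def barrier_height(losses):
    # How far the path rises above the straight line joining its
    # endpoints: a proxy for a high-loss region between two minima.
    baseline = np.linspace(losses[0], losses[-1], len(losses))
    return float(np.max(losses - baseline))

rng = np.random.default_rng(0)
W = rng.normal(size=(5, 16))   # hypothetical toy classification head
z_a = rng.normal(size=16)      # e.g., source-trained representation
z_b = rng.normal(size=16)      # e.g., fine-tuned representation
alphas, losses = loss_along_path(z_a, z_b, W, label=2)
print(round(barrier_height(losses), 4))
```

A large barrier along this path would indicate a sharp, high-loss region between the two representations, which is the kind of structure the paper's flattening approach targets.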
The paper introduces a new normalization layer, FLoR, that flattens the high-loss region between minima in the representation space.
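The summary does not give FLoR's exact form, so the following is only an illustrative stand-in: a layer that rescales each representation to unit L2 norm, which removes loss variation along the radial direction and is one simple way a normalization layer can flatten part of the representation-space landscape. Treat the class name and design as assumptions.

```python
import numpy as np

class FeatureNorm:
    """Illustrative normalization layer (NOT the paper's exact FLoR
    layer): rescales each representation to unit L2 norm, so the loss
    of any downstream head no longer varies along the radial
    direction of the representation space."""

    def __init__(self, eps=1e-6):
        self.eps = eps  # avoids division by zero for near-zero vectors

    def __call__(self, z):
        # z: (batch, dim) array of representations.
        norms = np.linalg.norm(z, axis=1, keepdims=True)
        return z / (norms + self.eps)

layer = FeatureNorm()
z = np.array([[3.0, 4.0], [0.0, 2.0]])
out = layer(z)  # each row now has (approximately) unit L2 norm
```

Because the layer is scale-invariant in its output, representations that differ only in magnitude map to the same point, collapsing one direction along which a high-loss region could otherwise arise.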
Experimental results show improved performance on 8 datasets compared to state-of-the-art methods.
Statistics
Cross-domain few-shot learning (CDFSL) must acquire knowledge from only limited training data in the target domain.
Experimental results on 8 datasets demonstrate that the approach outperforms state-of-the-art methods in terms of average accuracy.
Quotes
"Our contribution is the first to extend the analysis of loss landscapes from the parameter space to the representation space for the CDFSL task."
"Experimental results on 8 datasets demonstrate that our approach outperforms state-of-the-art methods in terms of average accuracy."