Federated learning faces challenges from limited client-server communication and from data heterogeneity across clients. This paper introduces momentum variants of FEDAVG and SCAFFOLD to address these challenges. By incorporating momentum, the algorithms achieve faster convergence rates without relying on bounded data-heterogeneity assumptions. Experiments on MLP and ResNet18 models demonstrate the effectiveness of the proposed methods, especially under severe data heterogeneity.
The paper explores how momentum can enhance FEDAVG and SCAFFOLD in federated learning. The proposed strategies are easy to implement, robust to data heterogeneity, and exhibit improved convergence rates. Experiments on the CIFAR-10 dataset with different neural networks validate the theoretical findings presented in the paper.
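To make the momentum idea concrete, below is a minimal NumPy sketch of FedAvg with a server-side momentum buffer. It is an illustration under simplifying assumptions (quadratic client losses, full participation, heavy-ball style smoothing of the averaged client update); the function names and hyperparameters here are hypothetical and do not reproduce the paper's exact algorithms.

```python
# Minimal sketch: FedAvg with a server-side momentum buffer.
# Assumptions: full client participation, quadratic client losses,
# heavy-ball smoothing of the averaged client update.
import numpy as np

def local_sgd(x, grad_fn, lr=0.1, steps=5):
    """Run a few local SGD steps on one client and return its updated model."""
    for _ in range(steps):
        x = x - lr * grad_fn(x)
    return x

def fedavg_momentum(grad_fns, dim, rounds=50, beta=0.9, server_lr=1.0):
    """FedAvg where the server smooths the averaged client update with momentum."""
    x = np.zeros(dim)          # global model
    m = np.zeros(dim)          # server momentum buffer
    for _ in range(rounds):
        # Each client starts from the current global model and trains locally.
        client_models = [local_sgd(x.copy(), g) for g in grad_fns]
        # Pseudo-gradient: average displacement of the clients from the global model.
        delta = x - np.mean(client_models, axis=0)
        # Momentum accumulation replaces the plain averaging step of FedAvg.
        m = beta * m + (1 - beta) * delta
        x = x - server_lr * m
    return x

if __name__ == "__main__":
    # Two heterogeneous clients with quadratic losses centered at different optima.
    targets = [np.array([1.0, -2.0]), np.array([-3.0, 4.0])]
    grad_fns = [lambda x, t=t: x - t for t in targets]
    print(fedavg_momentum(grad_fns, dim=2))  # approaches the mean of the client optima
```

The momentum buffer averages pseudo-gradients across rounds, which is why such schemes can tolerate large differences between client objectives without extra heterogeneity assumptions.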
Key insights extracted from: Ziheng Cheng et al., arxiv.org, 03-06-2024, https://arxiv.org/pdf/2306.16504.pdf