This paper explores the impact of mobility on the convergence of asynchronous Federated Learning (FL) via opportunistic mobile relaying. Clients can communicate with the server indirectly through other clients acting as relays, enabling quicker model uploads and fresher global models. The proposed FedMobile algorithm addresses the key questions of when and how to relay, and achieves a convergence rate of O(1/√NT). The algorithm also allows data manipulation before relaying, which reduces relaying cost and enhances privacy. Experiments on synthetic and real-world datasets confirm the theoretical findings.
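To make the relaying idea concrete, here is a minimal sketch of the two server-relevant pieces described above: a pre-relay manipulation step and a staleness-aware aggregation rule. The function names, the magnitude-based sparsification, and the specific staleness weighting are illustrative assumptions, not the exact mechanisms of FedMobile.

```python
import numpy as np

def relay_compress(update, keep=0.5):
    # Hypothetical pre-relay manipulation: keep only the largest-magnitude
    # fraction of the update's entries, reducing relay cost and limiting
    # what the relaying client observes.
    k = max(1, int(keep * update.size))
    mask = np.zeros_like(update)
    idx = np.argsort(np.abs(update))[-k:]
    mask[idx] = 1.0
    return update * mask

def server_apply(global_model, client_update, staleness, eta=0.5):
    # Assumed asynchronous aggregation rule: relayed updates arrive with
    # some staleness, so downweight them before mixing into the global model.
    w = eta / (1.0 + staleness)
    return (1 - w) * global_model + w * client_update
```

A relayed update would first pass through `relay_compress` at the originating client, then be applied at the server with a staleness value reflecting how many rounds it spent in transit.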
Key insights distilled from the paper by Jieming Bian... (arxiv.org, 03-19-2024): https://arxiv.org/pdf/2206.04742.pdf