Core Concepts
Exploiting mobility for faster convergence in asynchronous federated learning.
Abstract
This paper explores how mobility affects the convergence of asynchronous Federated Learning (FL) by exploiting opportunistic mobile relaying. Clients can communicate with the server indirectly, through other clients acting as relays, which enables quicker model uploads and fresher global models. The proposed FedMobile algorithm addresses the key questions of when and with whom to relay, and achieves a convergence rate of O(1/√NT). Manipulating the data before relaying further reduces communication cost and enhances privacy. Experiments on synthetic and real-world datasets confirm the theoretical findings.
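The core relaying idea, clients opportunistically handing their pending updates to peers who may reach the server sooner, can be illustrated with a toy discrete-time simulation. This is a minimal sketch, not the FedMobile algorithm itself; the contact and server-meeting probabilities and all function names are hypothetical.

```python
import random

def simulate_relaying(num_clients=10, num_rounds=50, contact_prob=0.3,
                      server_prob=0.2, seed=0):
    """Toy model of opportunistic relaying in asynchronous FL.

    Each round, a client either meets the server directly (probability
    server_prob) and delivers every update it carries, or meets a random
    peer (probability contact_prob) and hands over its carried updates.
    Returns (direct, relayed): counts of updates delivered by their owner
    vs. delivered through a relay. Purely illustrative parameters.
    """
    rng = random.Random(seed)
    direct = relayed = 0
    # pending[i] = set of update owners whose updates client i carries
    pending = {i: {i} for i in range(num_clients)}
    for _ in range(num_rounds):
        for i in range(num_clients):
            if rng.random() < server_prob:
                # Client i reaches the server: deliver all carried updates.
                for owner in pending[i]:
                    if owner == i:
                        direct += 1
                    else:
                        relayed += 1
                pending[i] = {i}  # start computing a fresh local update
            elif rng.random() < contact_prob:
                # Opportunistic encounter: hand carried updates to a peer.
                j = rng.randrange(num_clients)
                if j != i:
                    pending[j] |= pending[i]
                    pending[i] = {i}
    return direct, relayed
```

With relaying enabled, updates from clients that rarely meet the server still reach it via better-connected peers, which is the mechanism the paper credits for fresher global models.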
Stats
FedMobile achieves a convergence rate of O(1/√NT).
The proposed method outperforms state-of-the-art asynchronous FL baselines, substantially reducing the time required for training.