This paper provides a comprehensive survey of, and perspective on, decentralized federated learning (DFL). It begins by reviewing the methodology, challenges, and variants of centralized federated learning (CFL) to establish the background for DFL.
The paper then systematically introduces five key taxonomies of DFL: iteration order, communication protocol, network topology, paradigm proposal, and temporal variability. Together, these taxonomies give a detailed view of the DFL framework.
Based on the network topology taxonomy, the paper proposes five DFL variants, using them to categorize the recent literature, anticipate potential application scenarios, and highlight each variant's advantages. These variants correspond to line, ring, mesh, star, and hybrid topologies.
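To make the topology-based view concrete, the following is a minimal sketch of what a single DFL communication round over a ring topology might look like, assuming plain neighbor averaging after local training. The function names (ring_neighbors, local_update, dfl_round) and the toy quadratic loss are illustrative assumptions, not the aggregation rules prescribed by the paper.

```python
import numpy as np

def ring_neighbors(node: int, num_nodes: int) -> list[int]:
    """Left and right neighbors of a node on a ring topology."""
    return [(node - 1) % num_nodes, (node + 1) % num_nodes]

def local_update(model: np.ndarray, data: np.ndarray, lr: float = 0.1) -> np.ndarray:
    """Placeholder local training step: one gradient step on a toy quadratic loss."""
    grad = model - data.mean(axis=0)
    return model - lr * grad

def dfl_round(models: list[np.ndarray], datasets: list[np.ndarray]) -> list[np.ndarray]:
    """One DFL round: every node trains locally, then averages with its ring neighbors."""
    trained = [local_update(m, d) for m, d in zip(models, datasets)]
    aggregated = []
    for i in range(len(trained)):
        peers = [trained[j] for j in ring_neighbors(i, len(trained))]
        aggregated.append(np.mean([trained[i], *peers], axis=0))
    return aggregated

# Usage: 4 nodes with 2-dimensional models and synthetic local data.
rng = np.random.default_rng(0)
models = [rng.normal(size=2) for _ in range(4)]
datasets = [rng.normal(loc=i, size=(8, 2)) for i in range(4)]
for _ in range(10):
    models = dfl_round(models, datasets)
```

Swapping ring_neighbors for a different adjacency function would yield the line, mesh, star, or hybrid variants discussed in the survey, since only the set of peers each node averages with changes.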
Finally, the paper summarizes the current challenges in DFL, such as high communication overhead, computational and storage burdens, cybersecurity vulnerabilities, the lack of incentive mechanisms, and management issues. Possible solutions and future research directions are also discussed.