Core Concepts
This survey examines model fusion in federated learning and its integration with other learning paradigms, addressing challenges such as privacy, communication efficiency, and statistical heterogeneity.
Abstract
Federated learning is a new paradigm that decouples model training from direct access to centrally collected raw data. This survey reviews improvements to federated averaging algorithms and examines model fusion techniques. It also discusses combining federated learning with other paradigms, including transfer learning, meta-learning, unsupervised learning, and reinforcement learning, and highlights the state of the art, open challenges, and future directions in federated learning.
Stats
"Vast quantities of data are required for state-of-the-art machine learning algorithms."
"Improving communication efficiency is a critical issue."
"The edge clients provide the supervision signal for supervised machine learning models."
"Label scarcity is one of the problems emblematic of the federated setting."
"The server can be tasked with selecting the most reliable client models of the preceding round."
"Fully unsupervised data can be enhanced via domain adaptation."
"FedAvg starts from a random or warmed-up initialization of the client models, followed by local training."
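The FedAvg loop quoted above (initialize, train locally, aggregate) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the least-squares local objective, client data, learning rate, and round count are all assumptions chosen to keep the example self-contained.

```python
import numpy as np

def local_train(weights, data, lr=0.1, epochs=1):
    # Toy local update: a few SGD steps on a least-squares objective.
    # data = (X, y); in practice each client runs its own training loop.
    X, y = data
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def fedavg_round(global_w, client_data):
    # Server broadcasts the global model; each client trains locally;
    # the server averages the returned models weighted by dataset size.
    n_total = sum(len(y) for _, y in client_data)
    new_w = np.zeros_like(global_w)
    for X, y in client_data:
        w_k = local_train(global_w, (X, y))
        new_w += (len(y) / n_total) * w_k
    return new_w

# Synthetic setup: three clients sharing an underlying linear model.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(3):
    X = rng.normal(size=(20, 2))
    y = X @ true_w + rng.normal(scale=0.01, size=20)
    clients.append((X, y))

w = np.zeros(2)  # random (here: zero) initialization; could also be warmed up
for _ in range(50):
    w = fedavg_round(w, clients)
```

After enough rounds the averaged model approaches the shared underlying weights; with heterogeneous (non-IID) client data, convergence is slower and the fixed point can drift from the centralized solution, which is one motivation for the fusion improvements the survey discusses.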
Quotes
"Local data ownership inherits a basic level of privacy."
"Federated averaging assumes a regularization effect similar to dropout in neural networks."