Core Concepts
Optimizing communication-computation costs in Federated Learning over wireless networks is crucial for efficient training.
Abstract
The paper examines the challenges of distributed training in Federated Learning over wireless networks and proposes the FedCau algorithm to manage communication-computation costs efficiently. FedCau introduces a proactive stopping policy that jointly optimizes training performance and networking costs. The algorithm is applied to various communication protocols and datasets, demonstrating improved efficiency.
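The core idea of a cost-aware stopping policy can be sketched as follows. This is a minimal illustrative example, not the paper's actual FedCau rule: the function name, the per-unit-cost improvement threshold, and all parameter names are assumptions made for illustration.

```python
# Illustrative sketch (NOT the paper's exact FedCau policy): stop federated
# training when the marginal loss improvement no longer justifies the
# communication-computation cost of another round, or the budget runs out.

def should_stop(losses, iter_cost, budget, spent, eps=1e-3):
    """Return True if another FL round is not worth its cost.

    losses    : global loss values observed so far, one per iteration
    iter_cost : estimated communication + computation cost of one more round
    budget    : total cost budget
    spent     : cost already spent
    eps       : minimum acceptable loss improvement per unit of cost
    """
    if spent + iter_cost > budget:          # hard budget constraint
        return True
    if len(losses) < 2:                     # too early to judge progress
        return False
    improvement = losses[-2] - losses[-1]   # most recent loss decrease
    return improvement / iter_cost < eps    # gain per unit cost too small


# Early iterations make large progress, so training continues:
print(should_stop([1.0, 0.6], iter_cost=5.0, budget=100.0, spent=10.0))   # False
# Diminishing returns trigger an early stop, saving the remaining budget:
print(should_stop([1.0, 0.6, 0.45, 0.40, 0.398],
                  iter_cost=5.0, budget=100.0, spent=40.0))               # True
```

This captures the trade-off the abstract describes: unnecessary late iterations spend significant wireless resources for negligible accuracy gain, so a stopping criterion that weighs improvement against cost can save them.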
Structure:
Introduction to Federated Learning challenges
Proposed FedCau algorithm for cost-efficient training
Application of FedCau to different scenarios and datasets
Importance of communication-computation cost optimization
Stats
"We show that, given a total cost budget, the training performance degrades as either the background communication traffic or the dimension of the training problem increases."
"Our extensive results show that the FedCau methods can save the valuable resources one would spend through unnecessary iterations of FL, even when applied on top of existing methods from literature focusing on resource allocation problems."
Quotes
"We conclude that cost-efficient stopping criteria are essential for the success of practical FL over wireless networks."