Core Concepts
The Federated Postponed Broadcast (FedPBC) algorithm converges to a stationary point of the non-convex global objective in the presence of stochastic and dynamic communication failures, without requiring any "balancedness" assumption on uplink availability.
Abstract
The paper studies federated learning in the presence of stochastic and dynamic communication failures, where the uplink between the parameter server and client i is active with unknown probability p_i^t in round t. The authors first demonstrate that the widely adopted Federated Averaging (FedAvg) algorithm suffers significant bias when the p_i^t's vary across clients. To address this, they propose Federated Postponed Broadcast (FedPBC), a simple variant of FedAvg that postpones broadcasting the global model until the end of each round.
The key insights are:
FedPBC enables implicit gossiping among the clients with active links in each round, which helps bound the perturbation caused by non-uniform and time-varying p_i^t's (formalized in the sketch after this list).
FedPBC converges to a stationary point of the non-convex global objective, without requiring any "balancedness" assumption on the p_i^t's or bounded stochastic gradients/noises.
Extensive experiments on real-world datasets validate the analysis, showing FedPBC outperforms multiple baselines under diverse unreliable uplink patterns.
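To see why postponing the broadcast induces implicit gossiping (the first insight above), consider one plausible formalization, assuming the end-of-round average is sent back only over the links that were active; the notation A_t and W_t is ours, not necessarily the paper's.

```latex
% Let A_t be the set of clients with active uplinks in round t, and
% x_i^t the local model of client i after its local updates.
% The end-of-round broadcast replaces each active client's model with
% the average over A_t, while inactive clients keep their own models:
\[
x_i^{t+1} =
\begin{cases}
\dfrac{1}{|A_t|} \sum_{j \in A_t} x_j^t, & i \in A_t, \\[1ex]
x_i^t, & i \notin A_t.
\end{cases}
\]
% Stacked over clients this is one gossip step x^{t+1} = W_t x^t with a
% doubly stochastic mixing matrix W_t: [W_t]_{ij} = 1/|A_t| for
% i, j \in A_t, [W_t]_{ii} = 1 for i \notin A_t, and 0 otherwise.
```

Because each W_t preserves the average of the stacked models, repeated mixing among whichever clients happen to be active keeps the clients' models close to one another, which is what lets the analysis bound the perturbation from non-uniform, time-varying p_i^t without any balancedness assumption.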
Stats
The paper does not report specific numerical data or statistics for its key claims in this summary. The analysis is primarily theoretical, with numerical experiments on real-world datasets used to validate the performance of the proposed FedPBC algorithm.
Quotes
The paper does not contain any striking quotes that support its key arguments.