Core Concepts
This work develops two new compressed federated learning methods, SCALLION and SCAFCOM, that are practical to implement, robust to arbitrary data heterogeneity and partial client participation, support both biased and unbiased compressors, and attain stronger theoretical convergence guarantees than prior methods.
Abstract
The paper addresses challenges in federated learning (FL) arising from severe data heterogeneity, partial client participation, and heavy communication workloads. It proposes two new algorithms, SCALLION and SCAFCOM, which build on the SCAFFOLD method to achieve better communication efficiency, faster convergence rates, and robustness to arbitrary data heterogeneity and partial participation.
Key highlights:
SCALLION revisits the SCAFFOLD method and presents a simplified implementation that reduces the uplink communication cost by half. It employs unbiased compressors and achieves state-of-the-art convergence rates.
SCAFCOM enables the use of biased compressors by incorporating local momentum, further improving the communication and computation complexities, especially under aggressive compression.
Unlike prior related work, the theoretical analysis requires only standard smoothness and bounded gradient variance assumptions, with no additional restrictive conditions on data heterogeneity or compression errors.
Experiments demonstrate that SCALLION and SCAFCOM can match the performance of full-precision FL approaches with substantially reduced uplink communication, and outperform recent compressed FL methods under the same communication budget.
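The distinction above, unbiased compressors for SCALLION versus biased compressors combined with local momentum for SCAFCOM, can be illustrated with two standard compression operators. This is a hedged sketch: the function names and the momentum form below are generic examples (random-k and top-k sparsification), not the paper's exact algorithms.

```python
import numpy as np

def rand_k(x, k, rng):
    """Unbiased random-k sparsification: keep k random coordinates and
    rescale by d/k so that E[rand_k(x)] = x. This is the kind of
    unbiased compressor SCALLION supports."""
    d = x.size
    idx = rng.choice(d, size=k, replace=False)
    out = np.zeros_like(x)
    out[idx] = x[idx] * (d / k)
    return out

def top_k(x, k):
    """Biased top-k sparsification: keep the k largest-magnitude
    coordinates with no rescaling, so E[top_k(x)] != x in general.
    This is the kind of biased compressor SCAFCOM supports."""
    idx = np.argsort(np.abs(x))[-k:]
    out = np.zeros_like(x)
    out[idx] = x[idx]
    return out

def momentum_step(m, grad, beta=0.9):
    """Illustrative local momentum update m <- beta*m + (1-beta)*grad;
    SCAFCOM uses local momentum to control the error introduced by
    biased compression (exact update per the paper)."""
    return beta * m + (1.0 - beta) * grad
```

In practice, averaging many `rand_k` outputs recovers the original vector (unbiasedness), whereas `top_k` systematically zeroes small coordinates, which is why biased compressors need an error-control mechanism such as momentum.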
Stats
This summary does not reproduce specific numerical results or metrics; the paper's theoretical analysis focuses on establishing convergence rates and communication/computation complexities.