This work develops two new compressed federated learning approaches, SCALLION and SCAFCOM, that are practical to implement, robust to arbitrary data heterogeneity and partial client participation, support both biased and unbiased compressors, and offer stronger theoretical convergence guarantees than prior methods.
Shuffling a small fraction of synthetic data across clients can quadratically reduce the gradient dissimilarity and lead to a super-linear speedup in the convergence of federated learning algorithms under data heterogeneity.
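The quadratic effect claimed above can be illustrated with a toy simulation. The sketch below is an assumption-laden illustration, not the paper's actual construction: clients hold mean-shifted data, gradient dissimilarity is measured as the mean squared deviation of client gradients from the global average, and "shuffling" is modeled by replacing a fraction `p` of each client's points with draws from the pooled data (a stand-in for shared synthetic data). Mixing a fraction `p` of common data pulls each client gradient toward the global one by a factor `(1 - p)`, so the dissimilarity shrinks roughly by `(1 - p)^2`.

```python
import numpy as np

rng = np.random.default_rng(0)
n_clients, n_per, d = 10, 200, 5

# Heterogeneous clients: each client's data is shifted by its own mean,
# so local gradients disagree.
shifts = rng.normal(0, 3, size=(n_clients, d))
data = [rng.normal(shift, 1, size=(n_per, d)) for shift in shifts]

def dissimilarity(datasets):
    # For f_i(w) = 0.5 * E||w - x||^2 the gradient at w = 0 is -mean(x);
    # dissimilarity is the mean squared deviation from the average gradient.
    grads = np.array([-ds.mean(axis=0) for ds in datasets])
    gbar = grads.mean(axis=0)
    return np.mean(np.sum((grads - gbar) ** 2, axis=1))

def shuffle_fraction(datasets, p, rng):
    # Replace a fraction p of each client's points with points drawn
    # from the pooled data (hypothetical stand-in for synthetic data).
    pool = np.concatenate(datasets)
    k = int(p * n_per)
    out = []
    for ds in datasets:
        repl = pool[rng.choice(len(pool), size=k, replace=False)]
        out.append(np.concatenate([ds[k:], repl]))
    return out

base = dissimilarity(data)
for p in (0.1, 0.3, 0.5):
    mixed = dissimilarity(shuffle_fraction(data, p, np.random.default_rng(1)))
    # The observed ratio tracks the (1 - p)^2 quadratic reduction.
    print(f"p={p}: ratio={mixed / base:.3f}  (1-p)^2={(1 - p) ** 2:.3f}")
```

Running this shows the dissimilarity ratio falling roughly as `(1 - p)^2`, i.e. shuffling even 30% of the data cuts gradient dissimilarity by about half in this toy setting.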
The authors propose two novel privacy-preserving federated primal-dual learning algorithms, DP-FedPDM and BSDP-FedPDM, to solve non-convex, non-smooth federated learning problems while balancing communication efficiency and privacy protection.
FedSMOO jointly optimizes for global consistency and a smooth loss landscape, efficiently improving federated learning performance, especially on heterogeneous datasets.