Key Concepts
FedSMOO jointly optimizes for global consistency and a smooth loss landscape to efficiently improve performance in federated learning, especially on heterogeneous datasets.
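The "global flatness" idea can be sketched as a federated min-max (SAM-style) objective; the formulation below is a hedged reconstruction of the general form, not copied from the paper:

```latex
\min_{w} \; \max_{\|\epsilon\| \le \rho} \; \frac{1}{m} \sum_{i=1}^{m} f_i(w + \epsilon)
```

Here $f_i$ denotes client $i$'s local loss, $m$ the number of clients, and $\rho$ the radius of the shared perturbation; seeking a single $\epsilon$ that is adversarial for the global average encourages a flat minimum that is consistent across clients.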
Summary
The paper proposes a novel federated learning algorithm called FedSMOO that jointly considers both global consistency and a smooth loss landscape as optimization targets.
Key highlights:
- FedSMOO adopts a dynamic regularizer to align local optima with the global objective, while also using a global Sharpness-Aware Minimization (SAM) optimizer to search for consistent flat minima.
- Theoretical analysis shows that FedSMOO achieves a fast O(1/T) convergence rate without the typical assumption of bounded heterogeneous gradients, and also provides a generalization bound.
- Extensive experiments on CIFAR-10/100 datasets demonstrate FedSMOO outperforms several baselines, especially on highly heterogeneous data, by efficiently converging to a better minimum with a smoother loss landscape.
- FedSMOO also has lower communication costs compared to advanced federated learning methods.
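To make the SAM component above concrete, here is a minimal sketch of one Sharpness-Aware Minimization step on a toy quadratic loss. The full FedSMOO method additionally maintains a dynamic regularizer and globally consistent perturbations across clients, which are omitted here; all names and the toy loss are illustrative assumptions, not the paper's code.

```python
import numpy as np

def loss_grad(w, A, b):
    """Gradient of the toy quadratic loss 0.5 * ||A w - b||^2."""
    return A.T @ (A @ w - b)

def sam_step(w, A, b, lr=0.01, rho=0.05):
    """One SAM update: perturb toward the worst-case (ascent) direction,
    then descend using the gradient taken at the perturbed point."""
    g = loss_grad(w, A, b)
    eps = rho * g / (np.linalg.norm(g) + 1e-12)  # bounded ascent perturbation
    g_perturbed = loss_grad(w + eps, A, b)       # gradient at w + eps
    return w - lr * g_perturbed

rng = np.random.default_rng(0)
A = rng.normal(size=(8, 4))
b = rng.normal(size=8)
w = np.zeros(4)
for _ in range(500):
    w = sam_step(w, A, b)
loss = 0.5 * np.linalg.norm(A @ w - b) ** 2
print(f"final loss: {loss:.4f}")
```

Note that because the perturbed gradient does not vanish at the minimizer, plain SAM settles into a small neighborhood of the minimum rather than the exact point; FedSMOO's analysis handles this in the federated setting.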
Statistics
No specific numerical metrics are reproduced in this summary; the key figures and results are described qualitatively.
Quotations
"FedSMOO jointly considers both consistency and a global flat landscape."
"FedSMOO achieves fast O(1/T) convergence rate without the general assumption of bounded heterogeneous gradients."
"FedSMOO outperforms several baselines, especially on highly heterogeneous data, by efficiently converging to a better minimum with a smoother loss landscape."