The paper presents a new algorithm, FEderated Stochastic Smoothed Gradient Descent Ascent (FESS-GDA), for solving federated minimax optimization problems. The key contributions are:
FESS-GDA can be uniformly applied to several classes of federated nonconvex minimax problems, including Nonconvex-PL (NC-PL), Nonconvex-One-Point-Concave (NC-1PC), Nonconvex-Concave (NC-C), and a special case of NC-C problems (problem (2) in the paper). A minimal sketch of the algorithmic template is given below.
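Since the paper's exact pseudocode is not reproduced here, the following is a minimal sketch of a smoothed-GDA template in a federated setting, assuming a quadratic smoothing term (p/2)·||x − z||² around an auxiliary anchor z, local stochastic GDA steps on each client, and periodic server averaging. The function names, the anchor update rule, and the step sizes are illustrative assumptions, not the paper's exact method.

```python
import numpy as np

def fess_gda_sketch(grad_x, grad_y, x0, y0, m=4, rounds=50, local_steps=10,
                    eta_x=0.01, eta_y=0.02, p=1.0, beta=0.9):
    """Sketch of federated smoothed GDA (illustrative, not the paper's pseudocode).

    grad_x(i, x, y), grad_y(i, x, y): stochastic gradients of client i's local
    objective f_i at (x, y). The smoothing adds p * (x - z) to the x-gradient,
    where z is a slowly moving anchor tracked by the server.
    """
    xs = [x0.copy() for _ in range(m)]  # per-client primal iterates
    ys = [y0.copy() for _ in range(m)]  # per-client dual iterates
    z = x0.copy()                       # smoothing anchor (server state)
    x_bar, y_bar = x0.copy(), y0.copy()
    for _ in range(rounds):
        for i in range(m):              # clients run local steps in parallel
            for _ in range(local_steps):
                gx = grad_x(i, xs[i], ys[i]) + p * (xs[i] - z)  # smoothed descent on x
                gy = grad_y(i, xs[i], ys[i])                    # ascent on y
                xs[i] = xs[i] - eta_x * gx
                ys[i] = ys[i] + eta_y * gy
        x_bar = sum(xs) / m             # server averages after each round
        y_bar = sum(ys) / m
        xs = [x_bar.copy() for _ in range(m)]
        ys = [y_bar.copy() for _ in range(m)]
        z = z + beta * (x_bar - z)      # anchor drifts toward the average
    return x_bar, y_bar

# Toy usage: f_i(x, y) = 0.5*||x||^2 + x.y - 0.5*||y||^2 (strongly concave in y),
# with additive noise standing in for stochastic gradients.
gx = lambda i, x, y: x + y + 0.01 * np.random.randn(*x.shape)
gy = lambda i, x, y: x - y + 0.01 * np.random.randn(*y.shape)
x_star, y_star = fess_gda_sketch(gx, gy, np.ones(5), np.zeros(5))
```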
For NC-PL and Nonconvex-Strongly-Concave (NC-SC) problems, FESS-GDA achieves a per-client sample complexity of O(κ^2 m^-1 ϵ^-4) and a communication complexity of O(κ ϵ^-2), improving upon the previous best-known results by a factor of O(κ^2) in sample complexity and O(κ) in communication complexity; the bounds are written out below.
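For concreteness, the bounds can be written as follows, where m is the number of clients, κ the condition number, and ϵ the target accuracy. The "previous best" columns are inferred from the improvement factors quoted above, not taken from a specific prior paper.

```latex
\underbrace{O\!\left(\kappa^{2}\, m^{-1}\, \epsilon^{-4}\right)}_{\text{FESS-GDA, samples per client}}
\;\text{vs.}\;
\underbrace{O\!\left(\kappa^{4}\, m^{-1}\, \epsilon^{-4}\right)}_{\text{previous best}},
\qquad
\underbrace{O\!\left(\kappa\, \epsilon^{-2}\right)}_{\text{FESS-GDA, rounds}}
\;\text{vs.}\;
\underbrace{O\!\left(\kappa^{2}\, \epsilon^{-2}\right)}_{\text{previous best}}.
```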
For the special case (2), FESS-GDA achieves a per-client sample complexity of O(m^-1 ϵ^-4) and a communication complexity of O(ϵ^-2), substantially better than the corresponding complexities for general NC-C problems.
For general NC-C and NC-1PC problems, FESS-GDA matches the performance of the current state-of-the-art algorithms while requiring weaker assumptions.
FESS-GDA provides the first convergence results for general federated minimax problems under the PL-PL condition, and it achieves better communication complexity than previous works.
Experimental results on GAN training and fair classification tasks demonstrate the practical efficiency of FESS-GDA.
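As a concrete example of the NC-C structure exercised by the fair classification experiments: fair classification is commonly cast as minimizing the worst-case class loss, min_w max_{λ∈Δ} Σ_c λ_c ℓ_c(w), where the inner maximization over the probability simplex Δ is linear (hence concave) in λ. The paper's exact experimental setup is not reproduced here; below is a minimal sketch of the projected ascent step on λ, with all names and numbers illustrative.

```python
import numpy as np

def simplex_projection(v):
    """Euclidean projection of v onto the probability simplex (Duchi et al. style)."""
    u = np.sort(v)[::-1]                 # sort descending
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > (css - 1))[0][-1]
    theta = (css[rho] - 1.0) / (rho + 1.0)
    return np.maximum(v - theta, 0.0)

# Ascent step on the class weights lam for per-class losses L_c(w):
#   lam <- Proj_simplex(lam + eta * per_class_losses)
lam = np.ones(3) / 3                     # start from uniform class weights
per_class_losses = np.array([0.9, 0.2, 0.4])
lam = simplex_projection(lam + 0.5 * per_class_losses)
# lam now puts more weight on the worst-performing class.
```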