The authors propose HyperFedNet (HFN), which leverages hypernetworks to generate local model parameters, reducing communication costs and enhancing security in federated learning.
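A minimal sketch of the hypernetwork pattern this relies on, with illustrative names and dimensions that are not taken from the paper: a compact hypernetwork maps a client embedding to the weights of the target network, so only the small embedding or hypernetwork parameters need to cross the wire.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions, not from the paper).
EMB_DIM, HIDDEN, IN_DIM, OUT_DIM = 8, 32, 16, 4
TARGET_PARAMS = IN_DIM * OUT_DIM + OUT_DIM  # weights + bias of a linear target net

# Hypernetwork: maps a client embedding to a flat parameter vector.
W1 = rng.normal(0, 0.1, (EMB_DIM, HIDDEN))
W2 = rng.normal(0, 0.1, (HIDDEN, TARGET_PARAMS))

def hypernet(client_emb):
    h = np.tanh(client_emb @ W1)
    return h @ W2  # generated parameters for this client's model

def target_forward(flat_params, x):
    W = flat_params[:IN_DIM * OUT_DIM].reshape(IN_DIM, OUT_DIM)
    b = flat_params[IN_DIM * OUT_DIM:]
    return x @ W + b

client_emb = rng.normal(size=EMB_DIM)  # compact object to communicate
params = hypernet(client_emb)          # full weights generated, not transmitted
print(target_forward(params, rng.normal(size=(5, IN_DIM))).shape)  # (5, 4)
```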
The authors present Federated Prompts Cooperation via Optimal Transport (FedOTP), which addresses data heterogeneity in federated learning by combining global and local prompts through unbalanced Optimal Transport. FedOTP balances global consensus against local personalization and outperforms state-of-the-art methods.
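The transport step can be pictured with a generic entropic, Sinkhorn-style solver for unbalanced OT; this is a sketch under that assumption (the cost construction, feature shapes, and the eps and rho values are illustrative, and FedOTP's exact formulation may differ):

```python
import numpy as np

def unbalanced_sinkhorn(C, a, b, eps=0.05, rho=1.0, n_iter=200):
    """Entropic unbalanced OT via Sinkhorn-style scaling (KL-relaxed marginals).

    C: (n, m) cost matrix; a: (n,) and b: (m,) marginal weights.
    Returns the transport plan P of shape (n, m).
    """
    K = np.exp(-C / eps)
    u, v = np.ones_like(a), np.ones_like(b)
    exponent = rho / (rho + eps)  # < 1 softens the marginal constraints
    for _ in range(n_iter):
        u = (a / (K @ v)) ** exponent
        v = (b / (K.T @ u)) ** exponent
    return u[:, None] * K * v[None, :]

rng = np.random.default_rng(1)
prompts = rng.normal(size=(2, 64))   # e.g. one global + one local prompt feature
patches = rng.normal(size=(49, 64))  # e.g. a 7x7 visual feature map, flattened
pn = prompts / np.linalg.norm(prompts, axis=1, keepdims=True)
qn = patches / np.linalg.norm(patches, axis=1, keepdims=True)
P = unbalanced_sinkhorn(1.0 - pn @ qn.T, np.full(2, 0.5), np.full(49, 1 / 49))
print(P.shape)  # (2, 49): how strongly each prompt attends to each region
```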
FedRDMA is a communication-efficient federated learning system that integrates RDMA (remote direct memory access), achieving up to a 3.8× speedup over traditional TCP/IP-based systems.
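RDMA itself requires NIC and OS support, but the data path such a system feeds is easy to sketch: the serialized update is split into fixed-size chunks that the RDMA layer can transfer without CPU-side copies. The chunk size and helper names below are illustrative assumptions, not FedRDMA's actual interface.

```python
import numpy as np

CHUNK_BYTES = 4096  # illustrative; real systems tune this to the NIC

def to_chunks(update, chunk_bytes=CHUNK_BYTES):
    """Serialize a model update and split it into fixed-size chunks."""
    buf = update.astype(np.float32).tobytes()
    return [buf[i:i + chunk_bytes] for i in range(0, len(buf), chunk_bytes)]

def from_chunks(chunks, shape):
    """Reassemble received chunks back into the update tensor."""
    return np.frombuffer(b"".join(chunks), dtype=np.float32).reshape(shape)

update = np.random.default_rng(2).normal(size=10_000).astype(np.float32)
chunks = to_chunks(update)                    # would be handed to the RDMA layer
assert np.array_equal(from_chunks(chunks, update.shape), update)
print(len(chunks), "chunks of", CHUNK_BYTES, "bytes")
```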
The authors aim to improve federated learning's generalization to non-participating clients, i.e., clients whose data are never seen during training.
The authors propose a contribution-aware asynchronous federated learning method for realistic settings with slow and unreliable communication; it dynamically weights each update's contribution according to its staleness and the client's statistical heterogeneity.
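A minimal sketch of such staleness- and heterogeneity-aware weighting (the decay constants and the exact weighting function are assumptions, not the paper's):

```python
import numpy as np

def contribution_weight(staleness, divergence, a=0.5, b=1.0):
    """Down-weight an update by its staleness and by the client's statistical
    divergence from the global model. a and b are made-up decay constants."""
    return 1.0 / ((1.0 + staleness) ** a * (1.0 + b * divergence))

def async_apply(global_w, client_update, staleness, divergence, lr=1.0):
    """Apply one asynchronous client update as soon as it arrives."""
    alpha = contribution_weight(staleness, divergence)
    return global_w + lr * alpha * client_update

w = np.zeros(10)
w = async_apply(w, np.full(10, 0.1), staleness=0, divergence=0.1)  # near-full weight
w = async_apply(w, np.full(10, 0.1), staleness=8, divergence=0.5)  # heavily damped
print(w[:3])
```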
The authors propose adaptive gradient methods for over-the-air model training, improving robustness by dynamically adjusting step sizes, and analyze the convergence rates of AdaGrad- and Adam-like algorithms under various system factors.
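To make the setting concrete, here is a toy AdaGrad step on an over-the-air aggregate, where the analog channel superimposes client gradients and adds noise; the noise level and averaging convention are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

def adagrad_ota_step(w, grads, G, lr=0.1, eps=1e-8, noise_std=0.01):
    """One AdaGrad step on an over-the-air aggregate: the server only sees
    the combined client gradients plus channel noise (noise_std is an
    illustrative noise level)."""
    g = np.mean(grads, axis=0) + rng.normal(0, noise_std, size=w.shape)
    G += g ** 2                        # AdaGrad accumulator
    w -= lr * g / (np.sqrt(G) + eps)   # per-coordinate adaptive step size
    return w, G

w, G = np.zeros(4), np.zeros(4)
client_grads = [rng.normal(size=4) for _ in range(8)]
for _ in range(5):
    w, G = adagrad_ota_step(w, client_grads, G)
print(w)
```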
The authors optimize task delegation in multi-server federated learning networks, balancing fairness and efficiency.
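The flavor of such an assignment problem can be sketched with a greedy heuristic (this is not the paper's method, just an illustration): each client takes its lowest-latency server that still has capacity, so latency stays low while no single server is overloaded.

```python
import numpy as np

def delegate(latency, max_load):
    """Greedy client-to-server assignment. latency: (clients, servers) matrix;
    max_load: per-server capacity cap enforcing a crude fairness notion."""
    n_clients, n_servers = latency.shape
    load = np.zeros(n_servers, dtype=int)
    assignment = np.full(n_clients, -1)
    for c in np.argsort(latency.min(axis=1)):  # order by best achievable latency
        for s in np.argsort(latency[c]):       # prefer the fastest server
            if load[s] < max_load[s]:
                assignment[c], load[s] = s, load[s] + 1
                break
    return assignment, load

rng = np.random.default_rng(4)
lat = rng.uniform(1, 10, size=(6, 2))  # 6 clients, 2 servers (illustrative)
assignment, load = delegate(lat, max_load=np.array([3, 3]))
print(assignment, load)  # capacities keep either server from taking everything
```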
FedUV regularizes local training so that local models emulate the IID setting, improving performance in non-IID scenarios.
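In that spirit, one can sketch two regularizers of the kind described, one spreading representations over the hypersphere and one keeping per-class prediction variance from collapsing; the exact loss forms here are assumptions, not copied from the paper:

```python
import numpy as np

def uniformity_loss(z, t=2.0):
    """Push normalized embeddings to spread over the unit hypersphere
    (log mean Gaussian potential over pairwise squared distances)."""
    z = z / np.linalg.norm(z, axis=1, keepdims=True)
    d2 = np.sum((z[:, None, :] - z[None, :, :]) ** 2, axis=-1)
    i, j = np.triu_indices(len(z), k=1)
    return np.log(np.mean(np.exp(-t * d2[i, j])))

def variance_loss(probs, gamma=0.1):
    """Hinge penalty when the per-class variance of predicted probabilities
    across the batch collapses, as it tends to under non-IID local data."""
    return np.mean(np.maximum(0.0, gamma - probs.std(axis=0)))

rng = np.random.default_rng(5)
emb = rng.normal(size=(32, 16))              # a batch of local representations
probs = rng.dirichlet(np.ones(10), size=32)  # a batch of predicted distributions
print(uniformity_loss(emb), variance_loss(probs))
```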
The authors introduce FedComLoc, which integrates compression techniques into federated learning to reduce communication costs.
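Top-k sparsification is one standard compressor of the kind such methods build on; a minimal sketch follows (FedComLoc's actual scheme may combine several techniques):

```python
import numpy as np

def topk_compress(update, k):
    """Keep only the k largest-magnitude entries of an update."""
    idx = np.argpartition(np.abs(update), -k)[-k:]
    return idx, update[idx]

def topk_decompress(idx, vals, size):
    """Scatter the kept entries back into a dense zero vector."""
    out = np.zeros(size)
    out[idx] = vals
    return out

rng = np.random.default_rng(6)
u = rng.normal(size=1000)
idx, vals = topk_compress(u, k=50)  # ~20x fewer values on the wire
restored = topk_decompress(idx, vals, u.size)
print(np.linalg.norm(u - restored) / np.linalg.norm(u))  # relative error
```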