The author presents ASYN2F, an asynchronous federated learning framework with bidirectional model aggregation that addresses heterogeneity among training workers and improves performance through novel aggregation algorithms.
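To make the asynchronous setting concrete, the following is a minimal NumPy sketch of server-side aggregation where each worker's update is merged as it arrives, with stale updates down-weighted. This staleness-discounting rule is a common asynchronous-FL heuristic used here for illustration, not necessarily ASYN2F's exact aggregation algorithm.

```python
import numpy as np

def apply_async_update(global_model, worker_update, staleness, base_lr=0.5):
    # Merge one worker's update as soon as it arrives (no synchronization
    # barrier). Updates computed against an older global model (higher
    # staleness) get a smaller mixing weight -- a generic heuristic, not
    # ASYN2F's published rule.
    alpha = base_lr / (1.0 + staleness)
    return (1.0 - alpha) * global_model + alpha * worker_update

model = np.zeros(3)
model = apply_async_update(model, np.ones(3), staleness=0)  # alpha = 0.5
model = apply_async_update(model, np.ones(3), staleness=4)  # alpha = 0.1
```

The second call shows the key property: an update four rounds stale moves the model only a tenth of the way toward the worker's parameters.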
The author proposes the FedCMD framework to address issues of heterogeneity in federated learning by introducing a novel contrastive layer selection mechanism based on Wasserstein distance.
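The Wasserstein-distance criterion can be illustrated with a small NumPy sketch: for 1-D empirical samples of equal size, the Wasserstein-1 distance is the mean absolute difference between sorted values. The layer names and activation statistics below are hypothetical; this only illustrates distance-based layer selection, not FedCMD's actual mechanism.

```python
import numpy as np

def wasserstein_1d(a, b):
    # Wasserstein-1 distance between two equal-size 1-D empirical samples:
    # the optimal transport plan simply pairs the sorted values.
    a, b = np.sort(a), np.sort(b)
    return float(np.abs(a - b).mean())

# Hypothetical per-layer activation samples from a local vs. global model.
rng = np.random.default_rng(0)
layer_dists = {
    name: wasserstein_1d(rng.normal(0.0, 1.0, 256), rng.normal(mu, 1.0, 256))
    for name, mu in [("conv1", 0.1), ("conv2", 0.5), ("fc", 1.0)]
}
# Select the layer whose activation distribution drifts the most.
selected = max(layer_dists, key=layer_dists.get)
```

A contrastive selection scheme could then personalize the most-drifted layer while keeping the rest shared.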
The author introduces FL-GUARD, a dynamic solution for tackling Negative Federated Learning (NFL) at run time, triggering recovery only when necessary. The framework outperforms previous approaches by detecting NFL quickly and efficiently.
The author introduces FedLoGe, a framework that enhances both local and generic model performance in the context of Federated Long-Tailed Learning by integrating representation learning and classifier alignment within a neural collapse framework.
FAX introduces a JAX-based library for large-scale distributed and federated computations, leveraging sharding mechanisms and automatic differentiation to simplify the expression of federated computations. The authors' central aim is a performant, scalable framework for federated computations in the data center, achieved by embedding federated building blocks as primitives in JAX.
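The core building block such a library must express is a weighted aggregation of per-client values. The sketch below is a plain-NumPy analogue of that federated-mean block; it does not use FAX's actual API, and the function name `federated_mean` is chosen here for illustration only.

```python
import numpy as np

def federated_mean(client_updates, weights):
    # Weighted average of per-client model updates -- the aggregation
    # building block that a federated-computation library exposes as a
    # primitive (FAX registers such blocks with JAX for sharding/autodiff;
    # this NumPy version only mirrors the semantics).
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return sum(wi * u for wi, u in zip(w, client_updates))

updates = [np.array([1.0, 2.0]), np.array([3.0, 4.0])]
avg = federated_mean(updates, weights=[1, 1])  # -> [2.0, 3.0]
```

Expressing aggregation as a first-class primitive is what lets a JAX-based framework differentiate through, and shard, an entire federated round.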
The paper introduces a universal federated learning framework that enables over-the-air computation via wireless communication using a novel joint source-channel coding scheme.
FedSN proposes a novel federated learning framework to address challenges in training models over LEO satellite networks.