APPFL is a comprehensive and extensible federated learning framework that offers solutions to key challenges in federated learning, including heterogeneity and security, and provides user-friendly interfaces for integrating new algorithms and adapting to diverse applications.
FLEX is a flexible federated learning framework that empowers researchers to customize data distribution, privacy parameters, and communication strategies, enabling the development of novel federated learning techniques.
FedSN is a novel federated learning framework that addresses the challenges of training models over LEO (low Earth orbit) satellite networks.
ASYN2F is an effective asynchronous federated learning framework with bidirectional model aggregation, outperforming existing techniques in both accuracy and convergence speed.
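To make the asynchronous-aggregation idea concrete, here is a minimal sketch of staleness-weighted mixing, a common approach in asynchronous federated learning. This is an illustrative example, not ASYN2F's actual update rule; the function name and the decay formula are assumptions.

```python
# Illustrative staleness-weighted asynchronous aggregation (not
# ASYN2F's exact rule): the server mixes in each arriving client
# model with a coefficient that shrinks as the update grows stale.

def async_update(global_model, client_model, staleness, base_lr=0.5):
    """Mix a possibly stale client model into the global model.

    Models are flat lists of parameters; `staleness` counts how many
    global rounds elapsed since the client pulled its copy.
    """
    alpha = base_lr / (1.0 + staleness)  # decay weight with staleness
    return [
        (1 - alpha) * g + alpha * c
        for g, c in zip(global_model, client_model)
    ]

g = [0.0, 0.0]
print(async_update(g, [1.0, 1.0], staleness=0))  # fresh update: [0.5, 0.5]
print(async_update(g, [1.0, 1.0], staleness=4))  # stale update: [0.1, 0.1]
```

The decay keeps stale clients from dragging the global model toward outdated parameters while still letting every client contribute.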
This work introduces a universal federated learning framework that enables computation over wireless communication using a novel joint source-channel coding scheme.
FL-GUARD introduces a dynamic run-time solution for detecting and recovering from Negative Federated Learning (NFL), triggering recovery only when necessary and detecting NFL faster than previous approaches.
FAX introduces a JAX-based library for large-scale distributed and federated computations, leveraging sharding mechanisms and automatic differentiation to simplify the expression of federated computations. The authors' main thesis is that embedding federated building blocks as primitives in JAX yields a performant, scalable framework for federated computation in the data center.
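The notion of a reusable federated building block can be illustrated with a minimal sketch in plain Python. This is not FAX's actual API; the function name `federated_mean` and its signature are assumptions chosen for illustration.

```python
# Minimal sketch of a federated-mean building block: a weighted
# average of per-client model updates, weighted by each client's
# example count. Names are illustrative, not FAX's real API.

def federated_mean(client_updates, client_weights):
    """Weighted average of per-client updates (flat lists of floats)."""
    total = sum(client_weights)
    dim = len(client_updates[0])
    return [
        sum(u[i] * w for u, w in zip(client_updates, client_weights)) / total
        for i in range(dim)
    ]

updates = [[1.0, 2.0], [3.0, 4.0]]  # two clients, 2-parameter model
weights = [1.0, 3.0]                # client 2 holds 3x the data
print(federated_mean(updates, weights))  # [2.5, 3.5]
```

In a system like FAX, such a block would additionally be sharded across accelerators and remain differentiable, so the same expression works inside larger JAX programs.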
The authors introduce FedLoGe, a framework that improves both local and generic model performance in Federated Long-Tailed Learning by integrating representation learning and classifier alignment within a neural collapse framework.