It is important to improve how existing federated learning algorithms learn effectively from delayed (straggler) clients.
The FedFisher algorithm leverages the Fisher information matrices of locally trained client models to perform efficient one-shot federated learning.
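To make the idea concrete, below is a minimal sketch of Fisher-weighted one-shot model fusion, assuming a diagonal Fisher approximation and flattened parameter vectors; the function names are illustrative and this is not the paper's reference implementation.

```python
import numpy as np

def diag_fisher(per_example_grads):
    # Diagonal empirical Fisher: mean of squared per-example gradients.
    return np.mean(np.asarray(per_example_grads) ** 2, axis=0)

def fisher_weighted_fusion(client_params, client_fishers, eps=1e-8):
    # One-shot fusion: weight each parameter coordinate by the client's
    # (diagonal) Fisher value, so coordinates that a client's data pins
    # down sharply dominate the merged global model.
    params = np.stack(client_params)            # (num_clients, dim)
    weights = np.stack(client_fishers) + eps    # (num_clients, dim)
    return (weights * params).sum(axis=0) / weights.sum(axis=0)

# Fuse two locally trained (flattened) models in a single round.
theta_a, theta_b = np.array([1.0, 2.0]), np.array([3.0, 0.0])
fisher_a, fisher_b = np.array([0.9, 0.1]), np.array([0.1, 0.9])
theta_global = fisher_weighted_fusion([theta_a, theta_b], [fisher_a, fisher_b])
```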
FEDPIT proposes a federated instruction-tuning algorithm that uses LLMs' in-context learning capability to autonomously generate task-specific synthetic training data, improving federated few-shot performance while addressing privacy preservation and data scarcity.
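As a rough illustration of the client-side idea, the sketch below uses a handful of locally held (instruction, response) pairs as in-context demonstrations to prompt an LLM for new synthetic pairs; `llm_generate` is a placeholder for whatever on-device or self-hosted model a deployment provides, and the prompt template is an assumption rather than the paper's exact format.

```python
import random

def make_fewshot_prompt(demos, k=3):
    # Build an in-context prompt from a few locally held (instruction, response) pairs.
    shots = random.sample(demos, min(k, len(demos)))
    lines = [f"Instruction: {ins}\nResponse: {out}\n" for ins, out in shots]
    lines.append("Instruction:")  # ask the model to continue with a brand-new task
    return "\n".join(lines)

def augment_local_data(demos, llm_generate, rounds=10):
    # Grow a client's training set with LLM-generated synthetic pairs;
    # the private demonstrations never leave the device.
    synthetic = []
    for _ in range(rounds):
        completion = llm_generate(make_fewshot_prompt(demos))  # placeholder LLM call
        if "Response:" in completion:
            instruction, response = completion.split("Response:", 1)
            synthetic.append((instruction.strip(), response.strip()))
    return demos + synthetic
```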
Developing new algorithms that learn more effectively from straggler clients remains an open direction in federated learning.
FedCompass introduces a semi-asynchronous federated learning algorithm with a computing-power-aware scheduler that addresses client heterogeneity and data disparities, achieving faster convergence and higher accuracy than other algorithms.
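The scheduling idea can be sketched as follows: the server estimates each client's computing speed from previously reported round times and assigns local step budgets so that heterogeneous clients arrive near a shared deadline. This is a simplified illustration under assumed class and parameter names, not the reference FedCompass implementation.

```python
class PowerAwareScheduler:
    """Assign per-client local-step budgets so that clients with very
    different computing speeds finish close to a shared deadline."""

    def __init__(self, base_steps=100, min_steps=20, max_steps=200):
        self.speed = {}  # client_id -> estimated local steps per second
        self.base_steps = base_steps
        self.min_steps = min_steps
        self.max_steps = max_steps

    def report(self, client_id, steps_done, seconds):
        # Update the speed estimate after a client finishes a round.
        self.speed[client_id] = steps_done / max(seconds, 1e-6)

    def assign(self, client_id, deadline_seconds):
        # Scale the step budget by estimated speed so the client lands
        # near the deadline; unseen clients get the default budget.
        if client_id not in self.speed:
            return self.base_steps
        steps = int(self.speed[client_id] * deadline_seconds)
        return max(self.min_steps, min(self.max_steps, steps))
```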
Incorporating momentum into the FEDAVG and SCAFFOLD algorithms significantly improves convergence rates and eliminates the need for assumptions about data heterogeneity. The proposed momentum variants offer state-of-the-art performance across various client-participation scenarios.
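A minimal sketch of server-side momentum layered on top of FedAvg-style aggregation is shown below; the function and hyperparameter names are illustrative, and the paper's exact variants (including how momentum interacts with SCAFFOLD's control variates) differ in detail.

```python
import numpy as np

def fedavg_momentum_round(global_params, client_updates, momentum_buf,
                          server_lr=1.0, beta=0.9):
    # One aggregation round: average the client deltas, fold them into a
    # momentum buffer, then apply the buffered direction to the global model.
    avg_delta = np.mean(np.stack(client_updates), axis=0)
    momentum_buf = beta * momentum_buf + avg_delta
    new_params = global_params + server_lr * momentum_buf
    return new_params, momentum_buf
```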