Proposing Adaptive Coded Federated Learning (ACFL) to optimize privacy and learning performance in the presence of stragglers.
The effectiveness of decentralised federated learning depends on network topology, motivating an improved initialisation strategy for artificial neural networks.
Efficient resource management and data privacy in federated learning.
FedLoGe (Federated Local and Generic Model Training in Fed-LT) enhances both local and generic model performance through representation learning and classifier alignment within a neural-collapse framework.
FEDPIT proposes a novel federated algorithm that leverages the in-context learning capability of large language models to autonomously generate task-specific synthetic training data, improving federated few-shot performance while preserving privacy.
FEDPIT uses LLMs to generate synthetic data for improved performance and privacy in federated instruction tuning.
PMFL integrates federated learning and meta-learning to efficiently handle heterogeneous medical datasets, achieving superior performance.
This research paper proposes a novel one-shot clustering algorithm for Multi-Task Hierarchical Federated Learning (MT-HFL) that leverages data similarity among users to improve accuracy and efficiency while preserving data privacy.
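The paper above does not spell out its clustering rule, but the idea of grouping clients once, before training, by data similarity can be sketched as follows. This is a hedged illustration, not the paper's algorithm: the use of normalised label histograms as the similarity signal, and the names `label_hist` and `cluster_clients`, are assumptions for the sketch.

```python
# Illustrative one-shot clustering for hierarchical FL: cluster clients a single
# time by the similarity of their label distributions (an assumed proxy; the
# MT-HFL paper may use a different similarity measure).
import numpy as np

def label_hist(labels, num_classes):
    """Normalised label histogram summarising a client's local data."""
    h = np.bincount(labels, minlength=num_classes).astype(float)
    return h / h.sum()

def cluster_clients(hists, k, seed=0):
    """One-shot k-means over client histograms (a few Lloyd steps, no retraining)."""
    rng = np.random.default_rng(seed)
    hists = np.asarray(hists)
    centers = hists[rng.choice(len(hists), size=k, replace=False)]
    for _ in range(10):
        assign = np.argmin(((hists[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for c in range(k):
            if (assign == c).any():
                centers[c] = hists[assign == c].mean(0)
    return assign

# Two clients dominated by class 0 and two by class 1 fall into two clusters.
h = [label_hist(np.array(x), 2) for x in
     ([0, 0, 0, 1], [0, 0, 1, 0], [1, 1, 1, 0], [1, 0, 1, 1])]
assign = cluster_clients(h, k=2)
print(assign[0] == assign[1], assign[2] == assign[3], assign[0] != assign[2])
```

Because clustering happens once on compact summaries rather than on raw data, the grouping adds negligible communication cost and never exposes individual samples.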
Even under extreme data heterogeneity, introducing a personalised warm-up phase can improve both the convergence speed and the final accuracy of federated learning.
HiCS-FL, a novel client selection method for federated learning, accelerates model training convergence and reduces variance in non-IID settings by estimating and leveraging client data heterogeneity during the sampling process.
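Sampling clients in proportion to an estimated heterogeneity score, as HiCS-FL does, can be sketched minimally as below. This is only an illustration in the spirit of the method: using label-distribution entropy as the heterogeneity estimate, and the names `heterogeneity_score` and `select_clients`, are assumptions, not the paper's exact estimator.

```python
# Hedged sketch of heterogeneity-aware client selection: score each client by
# the entropy of its label distribution (assumed proxy for data heterogeneity)
# and sample clients with probability proportional to that score.
import numpy as np

def heterogeneity_score(label_counts, eps=1e-12):
    """Shannon entropy of a client's label distribution; higher = more balanced."""
    p = np.asarray(label_counts, dtype=float)
    p = p / p.sum()
    return float(-(p * np.log(p + eps)).sum())

def select_clients(client_label_counts, m, seed=0):
    """Sample m clients without replacement, biased toward higher-entropy clients."""
    scores = np.array([heterogeneity_score(c) for c in client_label_counts])
    probs = scores / scores.sum()
    rng = np.random.default_rng(seed)
    return rng.choice(len(scores), size=m, replace=False, p=probs)

# A balanced client (50/50) scores higher than a heavily skewed one (99/1).
counts = [[50, 50], [99, 1], [45, 55], [90, 10]]
scores = [heterogeneity_score(c) for c in counts]
chosen = select_clients(counts, m=2)
print(scores[0] > scores[1], len(chosen))
```

Favouring higher-entropy clients tends to reduce the variance of the aggregated update in non-IID settings, which is the convergence benefit the summary describes.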