Core Concepts
The paper introduces FedCKD, a personalized federated learning method in which each client distills knowledge from both the global model and its own historical personalized model. This comprehensive distillation mitigates catastrophic forgetting while striking a balance between personalization and generalization.
Wang, P., Liu, B., Guo, W., Li, Y., & Ge, S. (2024). Towards Personalized Federated Learning via Comprehensive Knowledge Distillation. arXiv preprint arXiv:2411.03569.
This paper aims to address the challenge of catastrophic forgetting in personalized federated learning (PFL) while maintaining a balance between model personalization and generalization.
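The dual-teacher distillation idea described above can be sketched as a loss that pulls the student toward two soft targets: the global model (for generalization) and the client's previous personalized model (to resist forgetting). The function names, the weighting factor `alpha`, and the plain-Python formulation below are illustrative assumptions for exposition, not the paper's exact objective.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    exps = [math.exp(z / temperature) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kl_divergence(p, q):
    """KL(p || q) for two discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

def comprehensive_kd_loss(student_logits, global_logits, historical_logits,
                          alpha=0.5, temperature=2.0):
    """Distill from two teachers: the global model (generalization)
    and the client's historical personalized model (anti-forgetting).
    `alpha` is a hypothetical weight balancing the two signals."""
    s = softmax(student_logits, temperature)
    g = softmax(global_logits, temperature)
    h = softmax(historical_logits, temperature)
    return alpha * kl_divergence(g, s) + (1 - alpha) * kl_divergence(h, s)
```

When the student matches both teachers the loss is zero; as its predictions drift from either teacher, the corresponding KL term grows, so tuning `alpha` trades off tracking the global model against preserving previously learned personalized behavior.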