The paper presents the Federated Memory Strengthening (FedMeS) framework to address the challenges of client drift and catastrophic forgetting in personalized federated continual learning (PFCL) problems.
In the training process, FedMeS utilizes a small amount of local memory at each client to store samples from previous tasks. This local memory is replayed alongside the current task's data to calibrate local training, so that each client retains knowledge of previous tasks (mitigating catastrophic forgetting) while adapting to the new one.
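A minimal sketch of what such memory replay could look like during local training. This is an illustration, not the paper's exact algorithm: the function names, the random-replacement memory policy, and the batch-mixing ratio are all assumptions.

```python
import random

def build_replay_batch(current_batch, memory, replay_size=8):
    """Mix the current task's batch with samples drawn from the local
    memory of previous tasks, so each gradient step also rehearses
    earlier tasks and counteracts forgetting."""
    replay = random.sample(memory, min(replay_size, len(memory)))
    return current_batch + replay

def update_memory(memory, new_samples, capacity=200):
    """Keep the memory within a fixed budget using a simple
    random-replacement policy once the buffer is full."""
    for sample in new_samples:
        if len(memory) < capacity:
            memory.append(sample)
        else:
            slot = random.randrange(capacity)  # overwrite a random slot
            memory[slot] = sample
    return memory
```

In practice the mixed batch would feed a standard local optimization step; the key point is only that old-task samples participate in every update.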
In the inference process, FedMeS leverages the local memory to perform KNN-based Gaussian inference, further strengthening the model's personalization capability. Importantly, FedMeS is task-oblivious: the same inference process is applied to samples from every task, with no need to know which task a sample belongs to.
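One plausible reading of KNN-based Gaussian inference is a memory-based vote over the k nearest stored features, weighted by a Gaussian kernel on distance and interpolated with the model's own output. The sketch below assumes this form; the function name, the interpolation weight `alpha`, and the bandwidth `sigma` are illustrative, not taken from the paper.

```python
import numpy as np

def knn_gaussian_predict(query_feat, memory_feats, memory_labels,
                         model_probs, k=8, sigma=1.0, alpha=0.5):
    """Interpolate the model's class probabilities with a KNN vote
    over local-memory features, weighted by a Gaussian kernel."""
    num_classes = model_probs.shape[0]
    # Distance from the query feature to every stored memory feature.
    dists = np.linalg.norm(memory_feats - query_feat, axis=1)
    knn_idx = np.argsort(dists)[:k]
    # Gaussian kernel weights over the k nearest neighbours.
    weights = np.exp(-dists[knn_idx] ** 2 / (2.0 * sigma ** 2))
    weights /= weights.sum()
    # Turn neighbour labels into a soft class distribution.
    knn_probs = np.zeros(num_classes)
    for w, idx in zip(weights, knn_idx):
        knn_probs[memory_labels[idx]] += w
    # Convex combination of model output and memory-based vote.
    return alpha * model_probs + (1.0 - alpha) * knn_probs
```

Because the memory holds samples from all previous tasks in one pool, this procedure needs no task identity at test time, which matches the task-oblivious property described above.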
The paper provides a theoretical convergence analysis of FedMeS and extensive experimental evaluations across various datasets and settings. FedMeS is shown to outperform state-of-the-art baselines in average accuracy and forgetting rate in all experiments, demonstrating its effectiveness against client drift and catastrophic forgetting in PFCL problems.
by Jin Xie, Chen... on arxiv.org, 04-22-2024
https://arxiv.org/pdf/2404.12710.pdf