Core Concepts
Leveraging pre-trained generators for efficient knowledge transfer in heterogeneous federated learning.
Abstract
Introduction
Data scarcity pushes companies toward collaborative training, yet each develops its own custom model architecture.
Traditional FL assumes homogeneous client models and shares model parameters, raising intellectual-property and privacy concerns.
Heterogeneous Federated Learning (HtFL)
Addresses data and model heterogeneity.
Explores novel knowledge carriers beyond exchanging client model parameters.
Proposed Scheme: FedKTL
Utilizes pre-trained generators for knowledge transfer.
Produces image-vector pairs tailored to clients' tasks.
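The server-side pairing step can be sketched as follows. This is a minimal illustration, not the paper's implementation: `fake_generator` is a hypothetical stand-in for a real pre-trained generator's decoding step (FedKTL would use e.g. a pre-trained diffusion or GAN model), and the latent dimension and image size are arbitrary assumptions.

```python
import numpy as np

def fake_generator(latents: np.ndarray) -> np.ndarray:
    # Hypothetical stand-in for a pre-trained generator: maps each latent
    # vector to a 32x32x3 "image" via a fixed random projection.
    rng = np.random.default_rng(0)
    W = rng.standard_normal((latents.shape[1], 32 * 32 * 3)) * 0.01
    return np.tanh(latents @ W).reshape(-1, 32, 32, 3)

def make_image_vector_pairs(class_latents: np.ndarray):
    """Server side: decode one latent vector per class and bundle each
    generated image with its latent vector, forming a shared transfer set
    that clients can use for their local tasks."""
    images = fake_generator(class_latents)
    return list(zip(images, class_latents))

# One latent vector per class (10 classes, 512-d latents, both assumed).
latents = np.random.default_rng(1).standard_normal((10, 512))
pairs = make_image_vector_pairs(latents)
```

Clients receive only these compact pairs rather than other clients' model parameters, which is what makes the scheme upload-efficient.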
Methodology
An ETF (equiangular tight frame) classifier generates unbiased class prototypes with equal pairwise separation.
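A simplex ETF gives every class an equally separated, unit-norm prototype, which is why it avoids bias toward any class. A minimal sketch of the standard construction (function name and dimensions are illustrative):

```python
import numpy as np

def simplex_etf(num_classes: int, feat_dim: int, seed: int = 0) -> np.ndarray:
    """Fixed classifier weights forming a simplex equiangular tight frame.

    Returns a (feat_dim, num_classes) matrix whose columns are unit-norm
    prototypes with pairwise cosine similarity -1/(num_classes - 1),
    i.e. all classes are equally far apart.
    """
    assert feat_dim >= num_classes
    rng = np.random.default_rng(seed)
    # Orthonormal basis U (feat_dim x num_classes) via QR of a Gaussian matrix.
    U, _ = np.linalg.qr(rng.standard_normal((feat_dim, num_classes)))
    K = num_classes
    P = np.eye(K) - np.ones((K, K)) / K      # centering projector
    return np.sqrt(K / (K - 1)) * U @ P

W = simplex_etf(num_classes=10, feat_dim=64)
cos = W.T @ W                                # Gram matrix of the prototypes
```

Because the prototypes are fixed rather than learned, they stay unbiased even under heterogeneous, non-IID client data.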
Domain alignment maps prototypes into the generator's latent space so the generator receives valid latent vectors for image generation.
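The idea behind domain alignment can be sketched with a simple moment-matching transform. This is an assumption-laden simplification, not FedKTL's learned alignment module: it merely shifts and scales prototypes so their statistics match the generator's latent prior.

```python
import numpy as np

def align_to_latent_prior(prototypes: np.ndarray,
                          prior_mean: float,
                          prior_std: float) -> np.ndarray:
    """Moment-match prototypes to the generator's latent prior so the
    generator receives in-distribution inputs. A simplified stand-in for
    a learned domain-alignment step."""
    # Standardize per latent dimension, then rescale to the prior's moments.
    z = (prototypes - prototypes.mean(0)) / (prototypes.std(0) + 1e-8)
    return z * prior_std + prior_mean

# 16 prototypes in a 512-d latent space (dimensions are illustrative);
# a standard-normal prior (mean 0, std 1) is assumed here.
protos = np.random.default_rng(2).standard_normal((16, 512))
z = align_to_latent_prior(protos, prior_mean=0.0, prior_std=1.0)
```

Feeding out-of-distribution vectors to a frozen generator yields degenerate images, so some form of alignment is essential before generation.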
Experimental Results
FedKTL outperforms state-of-the-art methods by up to 7.31% in accuracy.
Impact Analysis
Scales to larger numbers of clients and maintains accuracy as local training epochs increase.
Ablation Study
Removal of key components significantly impacts performance, highlighting their importance in FedKTL.
Stats
"Results show that our upload-efficient FedKTL surpasses seven state-of-the-art methods by up to 7.31% in accuracy."