Key concepts
FedImpro aims to mitigate client drift in federated learning by constructing similar conditional feature distributions for local training, which reduces gradient dissimilarity across clients and improves generalization performance.
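The mechanism summarized above can be illustrated with a small sketch. The snippet below is an assumption-laden approximation, not the paper's implementation: it splits a model at a hidden layer, approximates the conditional distribution of split-layer features with per-class Gaussians (the statistics that would be shared), and trains the high-level part of the model on features sampled from that shared distribution alongside real data. All names (SplitNet, estimate_feature_stats, local_step) and the Gaussian choice are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SplitNet(nn.Module):
    """Model split into a low-level feature extractor and a high-level head."""
    def __init__(self, in_dim=32, feat_dim=16, num_classes=10):
        super().__init__()
        self.low = nn.Sequential(nn.Linear(in_dim, feat_dim), nn.ReLU())
        self.high = nn.Linear(feat_dim, num_classes)

    def forward(self, x):
        h = self.low(x)          # split-layer features
        return self.high(h), h

def estimate_feature_stats(model, x, y, num_classes):
    """Per-class mean/variance of split-layer features (the statistics a client could share)."""
    with torch.no_grad():
        _, h = model(x)
    stats = {}
    for c in range(num_classes):
        hc = h[y == c]
        if len(hc) > 0:
            stats[c] = (hc.mean(0), hc.var(0, unbiased=False) + 1e-6)
    return stats

def local_step(model, opt, x, y, shared_stats, num_sampled=32):
    """One local update: loss on real data for the full model, plus a loss on
    features sampled from the shared conditional distribution, which only
    reaches the high-level head (the sampled features carry no gradient to the
    low-level extractor)."""
    logits, _ = model(x)
    loss = F.cross_entropy(logits, y)

    if shared_stats:
        classes = list(shared_stats.keys())
        yc = torch.tensor([classes[i % len(classes)] for i in range(num_sampled)])
        mu = torch.stack([shared_stats[int(c)][0] for c in yc])
        var = torch.stack([shared_stats[int(c)][1] for c in yc])
        h_sampled = mu + var.sqrt() * torch.randn_like(mu)  # sample synthetic features
        loss = loss + F.cross_entropy(model.high(h_sampled), yc)

    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

# Illustrative single-client round on random data:
model = SplitNet()
opt = torch.optim.SGD(model.parameters(), lr=0.1)
x, y = torch.randn(64, 32), torch.randint(0, 10, (64,))
shared = estimate_feature_stats(model, x, y, num_classes=10)  # in FL this would be aggregated server-side
local_step(model, opt, x, y, shared)
```

Because every client draws from the same shared distribution, the high-level gradients computed on sampled features are more alike across clients, which is the intuition behind the reduced gradient dissimilarity claimed above.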
Statistics
Experimental results show that FedImpro helps FL models withstand data heterogeneity and improves their generalization performance.
Quotes
"We propose FedImpro to efficiently estimate feature distributions with privacy protection."
"Our main contributions include..."