The paper proposes an adaptive Federated Learning (FL) algorithm called FedEnt that uses entropy to mitigate the adverse impact of heterogeneity among participating clients. The key contributions are:
FedEnt introduces an entropy term to measure the diversity among the local model parameters of all clients. This entropy term is incorporated into the objective function to adaptively adjust the learning rate for each client.
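The entropy-based adaptation described above can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the choice to measure diversity via the entropy of normalized parameter deviations from the mean, and the specific damping form `base_lr / (1 + beta * H)`, are assumptions for illustration.

```python
import numpy as np

def parameter_entropy(client_params):
    """Entropy of the spread of client parameters around their mean.

    client_params: list of flattened parameter vectors, one per client.
    Diversity is measured here (an assumption; the paper's exact
    entropy term may differ) as the Shannon entropy of the normalized
    distances from each client's parameters to the global mean.
    """
    mean = np.mean(client_params, axis=0)
    dists = np.array([np.linalg.norm(p - mean) for p in client_params])
    probs = dists / dists.sum()          # normalize to a distribution
    probs = np.clip(probs, 1e-12, None)  # avoid log(0)
    return -np.sum(probs * np.log(probs))

def adaptive_lr(base_lr, client_params, beta=0.1):
    """Shrink the learning rate as client diversity (entropy) grows,
    so heterogeneous clients take more conservative local steps."""
    H = parameter_entropy(client_params)
    return base_lr / (1.0 + beta * H)
```

In this sketch the server would compute the entropy after collecting local models each round and broadcast the adjusted rate; FedEnt's actual design makes the rate per-client and decentralized via the mean-field estimators discussed next.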
Due to the lack of communication among clients during local training, a mean-field approach is introduced to estimate the terms related to other clients' local parameters. This enables a decentralized design of the adaptive learning rate for each client.
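The mean-field idea can be illustrated with a toy fixed-point iteration: before local training begins, estimate the trajectory of the average of all clients' parameters, then let each client use that precomputed estimate in place of live communication. This is a simplified sketch under assumed quadratic-style dynamics; the function names (`mean_field_estimators`, `grads_fn`) and the plain gradient-descent local update are illustrative, not the paper's derivation.

```python
import numpy as np

def mean_field_estimators(init_params, grads_fn, lr, T, n_iters=50, tol=1e-6):
    """Fixed-point iteration for mean-field estimators phi_t.

    phi_t approximates the average of all clients' local parameters at
    local step t, so each client can adapt without communicating during
    local training. init_params: list of per-client parameter vectors;
    grads_fn(p, phi): local gradient, possibly depending on the estimator.
    """
    n = len(init_params)
    phi = [np.mean(init_params, axis=0)] * T  # initial guess: global mean
    for _ in range(n_iters):
        # Simulate each client's T local updates against the current phi.
        trajs = []
        for p0 in init_params:
            p, traj = p0.copy(), []
            for t in range(T):
                p = p - lr * grads_fn(p, phi[t])
                traj.append(p.copy())
            trajs.append(traj)
        # Re-estimate phi_t as the cross-client average at each step.
        new_phi = [np.mean([trajs[i][t] for i in range(n)], axis=0)
                   for t in range(T)]
        gap = max(np.linalg.norm(new_phi[t] - phi[t]) for t in range(T))
        phi = new_phi
        if gap < tol:
            break
    return phi
```

The existence and uniqueness of such a fixed point is exactly what the paper's theoretical analysis addresses; this sketch only shows the computational pattern.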
Rigorous theoretical analysis is provided on the existence and determination of the mean-field estimators. The convergence rate of the proposed FedEnt algorithm is also proved.
Extensive experiments on real-world datasets (MNIST, EMNIST-L, CIFAR10, CIFAR100) show that FedEnt outperforms state-of-the-art FL algorithms (FedAvg, FedAdam, FedProx, FedDyn) under non-IID settings and achieves faster convergence.
Key insights from source content by Shensheng Zh... at arxiv.org, 04-15-2024: https://arxiv.org/pdf/2303.14966.pdf