The proposed contribution-aware asynchronous FL method improves convergence speed by dynamically adjusting each received update's contribution to the global model, based on its freshness (staleness) and the clients' statistical heterogeneity.
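A minimal sketch of such staleness-aware weighting, assuming a polynomial decay schedule and the helper names below (`staleness_weight`, `apply_update`, and the `alpha` exponent are illustrative, not from the method itself):

```python
import numpy as np

def staleness_weight(staleness: int, alpha: float = 0.6) -> float:
    # Hypothetical polynomial decay: fresher updates get weight near 1,
    # stale updates are down-weighted as (staleness + 1)^(-alpha).
    return (staleness + 1) ** -alpha

def apply_update(global_model: np.ndarray,
                 client_delta: np.ndarray,
                 staleness: int,
                 base_lr: float = 1.0) -> np.ndarray:
    # Scale the client's delta by a staleness-dependent factor before
    # mixing it into the global model (contribution-aware aggregation).
    w = base_lr * staleness_weight(staleness)
    return global_model + w * client_delta

global_model = np.zeros(3)
fresh = apply_update(global_model, np.ones(3), staleness=0)  # full weight
stale = apply_update(global_model, np.ones(3), staleness=7)  # down-weighted
```

In practice the weight could also fold in a heterogeneity term (e.g. divergence between the client's data distribution and the global one), but the freshness decay above captures the core idea.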
FedFa, a fully asynchronous parameter-update strategy for federated learning, eliminates server-side waiting time and guarantees convergence by merging historical model updates into the current update.
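One way to picture this merging is a momentum-style blend of a buffer of past updates with the newly arrived one, so the server never blocks on stragglers. This is an illustrative sketch, not FedFa's actual update rule; the function name and the `beta` mixing coefficient are assumptions:

```python
import numpy as np

def merge_with_history(history: np.ndarray,
                       current_update: np.ndarray,
                       beta: float = 0.5):
    # Blend the running buffer of past updates (history) into the
    # freshly received update; the merged vector is applied immediately
    # and also becomes the new history buffer.
    merged = beta * history + (1.0 - beta) * current_update
    return merged, merged

history = np.zeros(4)
merged, history = merge_with_history(history, np.full(4, 2.0))
```

Because every incoming update is folded in immediately, no client ever waits for a synchronization barrier, while the historical term damps the noise of out-of-order arrivals.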
Asynchronous federated learning mechanisms can be significantly improved by modeling the system's queuing dynamics, which enables non-uniform node sampling and tighter convergence guarantees.
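A toy illustration of queue-informed non-uniform sampling, under the assumption (mine, not the paper's) that each node's update arrival rate is known and slower nodes should be sampled more often to avoid underrepresentation:

```python
import random

def sampling_probs(arrival_rates):
    # Sample nodes with probability inversely proportional to their
    # update arrival rate, so rarely arriving (slow) clients still
    # contribute proportionally to the global model.
    inverse = [1.0 / r for r in arrival_rates]
    total = sum(inverse)
    return [x / total for x in inverse]

def sample_node(arrival_rates, rng=random):
    probs = sampling_probs(arrival_rates)
    return rng.choices(range(len(probs)), weights=probs, k=1)[0]

# Node 0 is the slowest (1 update/unit time) and gets the largest weight.
probs = sampling_probs([1.0, 2.0, 4.0])
```

Real queue-aware schemes derive these probabilities from an explicit queuing model of update arrivals rather than a fixed inverse-rate rule, but the effect is the same: sampling is skewed to counteract the system's natural arrival bias.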