AdaGossip, a novel decentralized learning method, dynamically adjusts the consensus step-size based on the compressed model differences between neighboring agents to improve performance under constrained communication.
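To make the adaptive consensus idea concrete, below is a minimal sketch of one per-agent update. The specific rule (an Adam-style second-moment estimate of the aggregated compressed differences) and all names and defaults are illustrative assumptions, not AdaGossip's exact algorithm.

```python
import numpy as np

def adagossip_consensus(x_i, compressed_diffs, v, gamma=1.0, beta=0.9, eps=1e-8):
    """One consensus step for agent i (illustrative sketch, not the paper's exact rule).

    x_i:              this agent's flattened model parameters
    compressed_diffs: list of compressed differences (x_j - x_i) received from neighbors j
    v:                running second-moment estimate of the aggregated differences
    """
    # Aggregate the compressed differences received from neighbors.
    agg = sum(compressed_diffs) / len(compressed_diffs)
    # Adam-style running estimate of the squared aggregated difference (assumption).
    v = beta * v + (1.0 - beta) * agg**2
    # Per-parameter adaptive consensus step-size: normalize by the second-moment estimate.
    x_i = x_i + gamma * agg / (np.sqrt(v) + eps)
    return x_i, v
```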
Averaging uncorrelated neural network models in gossip learning systems can lead to a "vanishing variance" problem, causing significant convergence delays. A variance-corrected model averaging algorithm is proposed to eliminate this issue, enabling gossip learning to achieve convergence efficiency comparable to federated learning.
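The variance argument can be illustrated with a short sketch: averaging n uncorrelated weight vectors shrinks their variance by roughly a factor of n, so a simple correction rescales the averaged weights around their mean. This rescaling rule is an assumption for illustration, not the paper's exact variance-corrected algorithm.

```python
import numpy as np

def variance_corrected_average(models):
    """Average a list of (uncorrelated) weight vectors and rescale to restore variance.

    Plain averaging of n independent parameter vectors reduces their variance by ~1/n
    (the "vanishing variance" effect); multiplying the deviations from the mean by
    sqrt(n) undoes that shrinkage. Illustrative sketch, not the published algorithm.
    """
    n = len(models)
    avg = np.mean(models, axis=0)
    mu = avg.mean()
    return mu + np.sqrt(n) * (avg - mu)
```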
This paper presents a decentralized collaborative learning framework that incorporates deep variational autoencoders (VAEs) for enhanced anomaly detection, together with a theoretical analysis of data privacy leakage to external parties when the trained models are shared.
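For reference, a common way to use a trained VAE for anomaly detection is to score inputs by their reconstruction error under samples from the latent posterior. The sketch below shows that standard recipe; it is assumed here for illustration and is not taken from the paper, and `encode`/`decode` are hypothetical callables.

```python
import numpy as np

def vae_anomaly_scores(x, encode, decode, n_samples=8):
    """Score inputs by mean reconstruction error over posterior samples (higher = more anomalous).

    `encode(x)` is assumed to return (mu, log_var); `decode(z)` maps latents back to input space.
    """
    mu, log_var = encode(x)
    errs = []
    for _ in range(n_samples):
        # Reparameterized sample from the approximate posterior.
        z = mu + np.exp(0.5 * log_var) * np.random.randn(*mu.shape)
        errs.append(np.mean((decode(z) - x) ** 2, axis=-1))
    return np.mean(errs, axis=0)
```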
MAPL jointly learns heterogeneous personalized models and a collaboration graph in a decentralized peer-to-peer setting, without relying on a central server.
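A minimal sketch of the kind of joint update such methods use is given below: each agent sets its collaboration weights by a softmax over negative model distances, then blends its own model with a weighted mixture of its neighbors'. This rule is an illustrative assumption, not MAPL's exact objective or update.

```python
import numpy as np

def update_collaboration_weights(w_i, neighbor_ws, temperature=1.0):
    """Weight neighbors by model similarity (softmax over negative distances) -- assumed rule."""
    dists = np.array([np.linalg.norm(w_i - w_j) for w_j in neighbor_ws])
    logits = -dists / temperature
    alphas = np.exp(logits - logits.max())
    return alphas / alphas.sum()

def personalized_aggregate(w_i, neighbor_ws, alphas, lam=0.5):
    """Blend the agent's own model with the learned weighted mixture of neighbor models."""
    mix = sum(a * w_j for a, w_j in zip(alphas, neighbor_ws))
    return (1.0 - lam) * w_i + lam * mix
```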