This paper introduces a novel algorithm for distributed nonconvex stochastic optimization with differential privacy, addressing privacy protection, communication efficiency, and convergence simultaneously. The algorithm injects privacy noise and quantizes the transmitted information, which strengthens the privacy guarantee while limiting the impact of the noise on convergence. The convergence rate and differential privacy level are analyzed and shown to improve on existing methods, and a numerical example demonstrates the effectiveness of the proposed algorithm.
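The summary only names the ingredients (privacy noise injection plus quantized communication between agents), not the paper's actual update rule. As a rough illustration of how one round of such a scheme might be structured, the sketch below combines a consensus mixing step, a local stochastic gradient step, Gaussian privacy noise, and unbiased stochastic quantization of the outgoing message. All names and parameters here (`quantize`, `local_round`, `W_row`, `sigma`, `num_levels`, `step`) are hypothetical assumptions for illustration, not the proposed algorithm.

```python
# Generic sketch of one round of noisy, quantized distributed stochastic
# optimization. NOT the paper's algorithm; an illustrative assumption only.
import numpy as np

def quantize(v, num_levels=16, v_max=1.0):
    """Unbiased uniform stochastic quantization of each coordinate onto
    num_levels levels in [-v_max, v_max]."""
    clipped = np.clip(v, -v_max, v_max)
    scaled = (clipped + v_max) / (2.0 * v_max) * (num_levels - 1)
    lower = np.floor(scaled)
    # Randomized rounding keeps the quantizer unbiased in expectation.
    q = lower + (np.random.rand(*np.shape(v)) < (scaled - lower))
    return q / (num_levels - 1) * 2.0 * v_max - v_max

def local_round(x, grad_fn, neighbor_msgs, W_row, sigma=0.1, step=0.01):
    """One agent's update: mix received (noisy, quantized) neighbor states,
    take a stochastic gradient step, then noise and quantize the outgoing message."""
    # Consensus step: weighted average of own state and neighbors' messages.
    mixed = W_row[0] * x + sum(w * m for w, m in zip(W_row[1:], neighbor_msgs))
    # Stochastic gradient of the (possibly nonconvex) local objective.
    g = grad_fn(x)
    x_new = mixed - step * g
    # Add Gaussian privacy noise, then quantize before transmission.
    msg = quantize(x_new + np.random.normal(0.0, sigma, size=x_new.shape))
    return x_new, msg
```

In this kind of scheme, only the noisy, quantized message leaves the agent, so the privacy noise masks the local data while quantization reduces the communication cost per round.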