Core Concept
An algorithm is proposed for privacy-preserving distributed nonconvex stochastic optimization with quantized communications.
Summary
This paper introduces a novel algorithm for distributed nonconvex stochastic optimization with differential privacy, addressing privacy protection, communication efficiency, and convergence simultaneously. Each node adds privacy noise to its local state and quantizes the information it transmits, which both strengthens the privacy guarantee and limits the impact of the noise on convergence. The convergence rate and differential privacy level are analyzed and shown to improve on existing methods, and a comprehensive numerical example demonstrates the effectiveness of the proposed algorithm.
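The noise-then-quantize step described above can be sketched as follows. This is a minimal illustration, not the paper's exact scheme: the Laplace noise distribution, the decaying scale parameters `sigma0` and `decay`, and the uniform mid-rise quantizer with resolution `step` are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def privatize_and_quantize(state, t, sigma0=1.0, decay=0.9, step=0.5):
    """One communication step for a node: add time-varying privacy
    noise to the local state, then quantize before transmission.

    All parameters here are illustrative, not the paper's: the noise
    is Laplace with a geometrically decaying scale, and the quantizer
    is a uniform quantizer with resolution `step`.
    """
    scale = sigma0 * decay**t                  # time-varying noise scale
    noisy = state + rng.laplace(0.0, scale, size=state.shape)
    quantized = step * np.round(noisy / step)  # uniform quantization
    return quantized

x = np.array([0.3, -1.2, 2.7])   # a node's local state
msg = privatize_and_quantize(x, t=5)
```

Because the quantizer output lives on a fixed grid, the transmitted message can be encoded with a finite number of bits, which is the source of the communication savings.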
Statistics
Each node adds time-varying privacy noises to its local state.
The proposed algorithm achieves mean convergence and a finite cumulative differential privacy budget over infinite iterations.
The subsampling method, controlled through the sample-size parameter, is employed.
Quotes
"The proposed algorithm achieves both the mean convergence and a finite cumulative differential privacy budget over infinite iterations."
"Compared with existing works, the differential privacy level is enhanced, and a finite cumulative differential privacy budget is achieved over infinite iterations."