This study presents two novel distributed solvers for least-squares optimization problems under differential privacy constraints. The first solver perturbs exchanged parameters with Gaussian and truncated Laplacian noise, while the second employs a shuffling mechanism combined with average consensus algorithms. Both approaches aim to preserve privacy while maintaining computation accuracy, making them suitable for large-scale distributed optimization tasks.
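To make the first mechanism concrete, here is a minimal sketch of sampling the two noise types mentioned above and adding them to parameters before they are shared. The function names and the rejection-sampling approach for the truncated Laplacian are illustrative assumptions, not the paper's exact construction:

```python
import math
import random

def gaussian_noise(sigma):
    # Gaussian mechanism: zero-mean noise with standard deviation sigma
    return random.gauss(0.0, sigma)

def truncated_laplace_noise(scale, bound):
    # Laplace(0, scale) restricted to [-bound, bound], via rejection sampling
    # (an assumed construction; the paper may truncate differently)
    while True:
        u = random.random() - 0.5
        if abs(u) >= 0.5:          # avoid log(0) at the boundary
            continue
        # inverse-CDF transform for the Laplace distribution
        sample = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
        if abs(sample) <= bound:
            return sample

def perturb(params, noise_fn):
    # Perturb each parameter before it leaves the agent
    return [p + noise_fn() for p in params]
```

Truncation keeps every released value within a known range, which is what makes the bounded-noise variant attractive when downstream computations cannot tolerate arbitrarily large perturbations.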
The paper discusses the challenges of preserving privacy in distributed optimization scenarios where local cost functions may contain sensitive data, and highlights the role of differential privacy in protecting that information during communication among network agents.
Two distinct solvers are proposed: the DP-GT-based solver and the DP-DiShuf-AC-based solver. The former perturbs a gradient-tracking algorithm, exhibiting a trade-off between privacy level and computation accuracy that depends on network size. In contrast, the latter leverages shuffling mechanisms to achieve differential privacy guarantees independent of network size.
Numerical simulations demonstrate the effectiveness of both solvers in solving least-squares optimization problems while preserving privacy. The results indicate that the DP-DiShuf-AC-based solver achieves better computation accuracy across varying differential-privacy requirements.
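The shuffle-then-average idea behind the DP-DiShuf-AC solver can be sketched in a few lines: perturbed messages are shuffled so that no message can be linked back to its originating agent, after which average consensus drives all agents to the common mean. This toy version (ring topology, uniform 1/3 weights) is an assumption for illustration, not the paper's DiShuf-AC protocol:

```python
import random

def shuffle_then_average(values, sigma=0.1, rounds=50):
    # Each agent perturbs its private value; a shuffler then permutes the
    # batch, breaking the agent-to-message link; finally, ring average
    # consensus converges to the mean of the shuffled noisy values.
    n = len(values)
    noisy = [v + random.gauss(0.0, sigma) for v in values]
    random.shuffle(noisy)              # anonymize message origins
    x = noisy[:]
    for _ in range(rounds):            # average consensus on a ring
        x = [(x[i - 1] + x[i] + x[(i + 1) % n]) / 3.0 for i in range(n)]
    return x
```

Since the mixing matrix is doubly stochastic, the network-wide average is preserved exactly; the only accuracy loss comes from the injected noise itself, consistent with the summary's claim that the privacy guarantee does not degrade with network size.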
The study concludes by comparing the performance of both solvers across different network sizes, highlighting their robustness under stricter privacy requirements and their superior computation accuracy compared to traditional methods.
Key insights distilled from: Weijia Liu et al., arxiv.org, 03-05-2024
https://arxiv.org/pdf/2403.01435.pdf