The author presents the ADBB method for distributed optimization over unbalanced directed networks, emphasizing its larger admissible step sizes and accelerated convergence.
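The summary does not reproduce ADBB's update, but if, as the name suggests, the method builds on Barzilai-Borwein (BB) step sizes, the rule below illustrates how such step sizes adapt to curvature and can exceed a conservative fixed step. This is a minimal centralized sketch on a hypothetical quadratic (the matrix A, vector b, and iteration counts are illustrative only, not the paper's algorithm):

```python
import numpy as np

# Hypothetical strongly convex quadratic f(x) = 0.5 x^T A x - b^T x.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
A = A @ A.T + np.eye(5)            # symmetric positive definite
b = rng.standard_normal(5)

def grad(x):
    return A @ x - b

x_prev = np.zeros(5)
g_prev = grad(x_prev)
x = x_prev - 0.01 * g_prev         # one small safeguard step to initialize BB

for _ in range(50):
    g = grad(x)
    s, y = x - x_prev, g - g_prev
    alpha = (s @ s) / (s @ y)      # BB1 rule: adapts to local curvature and is
    x_prev, g_prev = x, g          # often much larger than a fixed 1/L step
    x = x - alpha * g

print("gradient norm after BB iterations:", np.linalg.norm(grad(x)))
```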
This paper introduces two distributed solvers for least-squares problems under differential-privacy requirements. The first solver perturbs parameters with Gaussian and truncated Laplacian noise; the second combines a shuffling mechanism with average consensus to compute the solution privately.
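As a rough illustration of the first solver's ingredients, the sketch below perturbs each agent's local value with Gaussian noise before sharing and then runs average consensus. The noise scale sigma, the ring topology, and the weight matrix W are all assumptions for illustration; the truncated-Laplacian and shuffling components of the paper are omitted:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6
local_values = rng.standard_normal(n) + 3.0    # hypothetical private local parameters

# Perturb before sharing: each agent releases only a noisy version of its value.
sigma = 0.5                                    # noise scale, set by the privacy budget
shared = local_values + rng.normal(0.0, sigma, size=n)

# Average consensus on a ring: x <- W x with a doubly stochastic weight matrix W.
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 0.5
    W[i, (i - 1) % n] = 0.25
    W[i, (i + 1) % n] = 0.25

x = shared.copy()
for _ in range(200):
    x = W @ x

print("true mean:      ", local_values.mean())
print("consensus value:", x[0])                # mean of the *perturbed* values
```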
The authors introduce a comprehensive framework for analyzing distributed algorithms, proving that the agents' iterates converge to a common estimate. Within this framework, they develop specific algorithms for dynamic convex optimization problems.
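The framework itself is abstract, but a minimal instance of what "convergence to a common estimate" under a dynamic cost can look like is a consensus-plus-gradient iteration on drifting local quadratics. Everything below (topology, drift model, step size) is a hypothetical stand-in, not the paper's algorithm:

```python
import numpy as np

n, T = 4, 300
W = np.zeros((n, n))               # doubly stochastic weights on a ring
for i in range(n):
    W[i, i] = 0.5
    W[i, (i - 1) % n] = 0.25
    W[i, (i + 1) % n] = 0.25

def target(i, t):
    # Hypothetical drifting minimizer of agent i's time-varying local cost
    # f_i^t(x) = 0.5 * (x - target(i, t))**2.
    return i + 0.01 * t

rng = np.random.default_rng(2)
x = rng.standard_normal(n)         # each agent's scalar estimate
for t in range(T):
    grads = x - np.array([target(i, t) for i in range(n)])
    x = W @ x - 0.1 * grads        # mix with neighbors, then local gradient step

print("final estimates:", x)       # nearly equal: a common, tracking estimate
print("mean target:    ", np.mean([target(i, T) for i in range(n)]))
```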
The work provides insights into Byzantine-resilient distributed optimization algorithms and their convergence properties.
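The summary does not say which resilient rules are covered; as one standard example of the genre (not necessarily the rule analyzed in this work), a coordinate-wise trimmed-mean aggregation sketch that tolerates up to f adversarial vectors:

```python
import numpy as np

def trimmed_mean(vectors, f):
    """Coordinate-wise trimmed mean: drop the f largest and f smallest
    entries per coordinate, then average the rest. A common
    Byzantine-resilient aggregation rule."""
    V = np.sort(np.asarray(vectors), axis=0)
    return V[f:len(vectors) - f].mean(axis=0)

rng = np.random.default_rng(3)
honest = [rng.normal(1.0, 0.1, size=3) for _ in range(7)]
byzantine = [np.full(3, 100.0), np.full(3, -100.0)]   # adversarial outliers

print("plain mean:  ", np.mean(honest + byzantine, axis=0))
print("trimmed mean:", trimmed_mean(honest + byzantine, f=2))
```

The plain mean is dragged arbitrarily far by the outliers, while the trimmed mean stays near the honest agents' average, which is the basic intuition behind resilient aggregation.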