Key Concepts
The authors propose two communication-efficient decentralized optimization algorithms, Compressed Push-Pull (CPP) and Broadcast-like CPP (B-CPP), that achieve linear convergence for minimizing strongly convex and smooth objective functions over general directed networks.
Abstract
The paper proposes two decentralized optimization algorithms that combine gradient tracking with communication compression to solve the problem of minimizing the average of local objective functions over a general directed network.
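In the notation commonly used for this problem class (the symbols below are generic, not necessarily the paper's), each agent i holds a private local objective f_i and all agents cooperate to solve

```latex
% Decentralized consensus optimization over n agents on a directed graph:
% agent i can evaluate only its own f_i, yet all agents seek a common
% minimizer of the network-wide average objective.
\min_{x \in \mathbb{R}^p} \; f(x) \;=\; \frac{1}{n} \sum_{i=1}^{n} f_i(x)
```

with the objective functions strongly convex and smooth, and with information flowing only along the directed edges of the network.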
The first algorithm, Compressed Push-Pull (CPP), uses a general class of unbiased compression operators and achieves linear convergence for strongly convex and smooth objective functions. The second, Broadcast-like CPP (B-CPP), further reduces communication costs through broadcast-like updates and also enjoys linear convergence under the same conditions.
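To make the notion of an unbiased compression operator concrete, here is a minimal sketch of one standard member of that class, rand-k sparsification with rescaling, which satisfies E[C(x)] = x; it is purely illustrative and not necessarily the operator used in the paper:

```python
import numpy as np

def rand_k_compress(x: np.ndarray, k: int, rng: np.random.Generator) -> np.ndarray:
    """Unbiased rand-k sparsification: transmit k randomly chosen coordinates,
    scaled by d/k so that the compressed vector equals x in expectation."""
    d = x.size
    out = np.zeros_like(x)
    idx = rng.choice(d, size=k, replace=False)
    out[idx] = x[idx] * (d / k)
    return out

# Empirical check of unbiasedness: averaging many compressed copies
# recovers the original vector up to sampling noise.
rng = np.random.default_rng(0)
x = rng.standard_normal(10)
avg = np.mean([rand_k_compress(x, 3, rng) for _ in range(20000)], axis=0)
print(np.max(np.abs(avg - x)))  # small; shrinks as the number of samples grows
```

Transmitting only k of d coordinates is what yields the communication savings, while the unbiasedness is the property the convergence analysis relies on.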
The key highlights and insights are:
- CPP combines the gradient tracking Push-Pull method with communication compression, allowing it to work under a general class of unbiased compression operators and achieve linear convergence over directed graphs (a toy sketch of the underlying Push-Pull iteration appears after this list).
- B-CPP is a broadcast-like version of CPP that can be applied in an asynchronous setting and further reduces communication costs compared to CPP.
- The theoretical analysis shows that both CPP and B-CPP achieve linear convergence for minimizing strongly convex and smooth objective functions over general directed networks.
- Numerical experiments demonstrate the advantages of the proposed methods in terms of communication efficiency.
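The paper's exact recursions are not reproduced in this summary, so the following is only a toy sketch of the uncompressed Push-Pull gradient-tracking iteration that CPP builds on, run on a directed ring with simple quadratic local objectives; the mixing matrices, step size, and objectives are illustrative assumptions, and a comment marks the exchange that CPP would additionally compress.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 5, 4                              # agents, decision dimension
c = rng.standard_normal((n, p))          # f_i(x) = 0.5 * ||x - c_i||^2

def grad(i, x):
    return x - c[i]                      # gradient of the local quadratic

# Directed ring: each agent hears only from its predecessor (plus itself).
R = np.zeros((n, n))                     # row-stochastic "pull" matrix (decision variable)
C = np.zeros((n, n))                     # column-stochastic "push" matrix (gradient tracker)
for i in range(n):
    R[i, i] = R[i, (i - 1) % n] = 0.5
    C[i, i] = C[(i + 1) % n, i] = 0.5

alpha = 0.05                             # step size (illustrative choice)
x = np.zeros((n, p))
y = np.array([grad(i, x[i]) for i in range(n)])   # trackers start at local gradients
g_old = y.copy()

for _ in range(1000):
    # In CPP, the vectors exchanged here would be compressed with an unbiased
    # operator (plus auxiliary variables); this sketch sends them exactly.
    x = R @ (x - alpha * y)                        # pull step: mix and descend
    g_new = np.array([grad(i, x[i]) for i in range(n)])
    y = C @ y + g_new - g_old                      # push step: track the average gradient
    g_old = g_new

print(np.linalg.norm(x - c.mean(axis=0), axis=1))  # every agent near the global minimizer
```

The row-stochastic pull matrix drives consensus on the decision variable, while the column-stochastic push matrix keeps the trackers summing to the total gradient, which is what enables exact linear convergence over directed graphs.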
Statistics
This summary quotes no explicit numerical data or statistics from the paper. The key claims rest on theoretical analysis establishing the linear convergence properties of the proposed algorithms, complemented by numerical experiments on communication efficiency.