
Optimal Communication Protocols for Approximating Sums of Entrywise Functions in the Coordinator Model and Beyond

Core Concepts
The core message of this paper is to provide optimal communication protocols for approximating sums of entrywise functions in the coordinator model, as well as efficient algorithms for solving linear algebra problems in more general network topologies using the personalized CONGEST model.
The paper addresses the problem of approximating Σi f(xi) in the coordinator model, where each of s servers holds a non-negative vector x(j), the aggregate vector is x = Σj x(j), and the goal is to approximate Σi f(xi) up to a 1±ε factor for a given non-negative function f. The key highlights and insights are:

- The authors introduce a new parameter cf[s] that captures the communication complexity of approximating Σi f(xi) more accurately than the previously used parameter cf,s.
- For functions f that satisfy an "approximate invertibility" property, the authors give a two-round protocol that uses O_{θ,θ',θ''}(cf[s]/ε²) bits of communication to approximate Σi f(xi) up to a 1±ε factor.
- For the special case f(x) = x^k, the authors show that cf[s] = s^(k-1), and their protocol matches the known lower bounds, resolving the open question of the optimal communication complexity of Fk-moment estimation in the coordinator model.
- The authors also show that any one-round algorithm for Fk-moment estimation must use Ω(s^(k-1)/ε^k) bits of communication, demonstrating the optimality of their two-round protocol.
- Beyond the coordinator model, the authors study the personalized CONGEST model and give efficient algorithms for computing ℓp-subspace embeddings, solving ℓp-regression, and low-rank approximation, where the communication per node in each round is polylogarithmic in the relevant parameters.
xi = Σj xi(j), so the target quantity is Σi f(xi) = Σi f(Σj xi(j))
cf[s] = the smallest number c such that f(Σj yj) ≤ c · (Σj √f(yj))² for all yj ≥ 0
Ω(s^(k-1)/ε²) lower bound for Fk-moment estimation
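For f(x) = x^k, the claim cf[s] = s^(k-1) can be sanity-checked numerically: the defining inequality becomes (Σj yj)^k ≤ s^(k-1) · (Σj yj^(k/2))². The following is a minimal illustrative check, not the paper's proof; the helper name `check_cf_bound` is hypothetical.

```python
import random

def check_cf_bound(k, s, trials=1000):
    """Numerically verify that c = s**(k-1) satisfies the c_f[s] inequality
    for f(x) = x**k, i.e., (sum y)**k <= c * (sum y**(k/2))**2
    for random non-negative vectors y of length s."""
    c = s ** (k - 1)
    for _ in range(trials):
        y = [random.random() for _ in range(s)]
        lhs = sum(y) ** k
        rhs = c * sum(v ** (k / 2) for v in y) ** 2
        if lhs > rhs * (1 + 1e-9):  # tiny tolerance for float round-off
            return False
    return True

print(check_cf_bound(k=3, s=5))  # expect True: the bound holds for all k >= 1
```

The inequality follows from two facts: (Σ yj)^k ≤ s^(k-1) Σ yj^k by the power-mean inequality, and Σ yj^k ≤ (Σ yj^(k/2))² since the cross terms on the right are non-negative.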
"For this broad class of functions, our result improves upon the communication bounds achieved by Kannan, Vempala, and Woodruff (COLT 2014) and Woodruff and Zhang (STOC 2012), obtaining the optimal communication up to polylogarithmic factors in the minimum number of rounds."

"We show that our protocol can also be used for approximating higher-order correlations."

"Our sketch construction may be of independent interest and can implement any importance sampling procedure that has a monotonicity property."

Deeper Inquiries

How can the techniques developed in this paper be extended to other classes of functions beyond the "approximate invertibility" property?

The techniques developed for functions with the "approximate invertibility" property could plausibly extend to other structured classes of functions. For example, submodular or supermodular functions might be approximable using similar correlated randomness and composable sketching techniques, and functions with structural properties such as convexity or concavity are natural further targets. In each case, the key step is identifying an analogue of the approximate-invertibility condition, i.e., a bound on how much the function's value on the aggregate vector can exceed a combination of its values on the servers' local vectors, since such a bound is what controls the communication needed to approximate the sum in a distributed setting.

What are the implications of the personalized CONGEST model and the composable sketching techniques for other distributed optimization and machine learning problems?

The personalized CONGEST model and composable sketching techniques have significant implications for other distributed optimization and machine learning problems. The ability to compute subspace embeddings, solve ℓp-regression, and approximate low-rank matrices efficiently in this model extends naturally to tasks such as collaborative filtering, clustering, and dimensionality reduction in distributed machine learning. Because the sketches are composable, each node can summarize its local data once and the summaries can be merged along arbitrary network paths, which also makes the framework suitable for sensitivity sampling and importance sampling procedures in broader optimization tasks.
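The two ingredients above can be illustrated with a toy example: an AMS-style F2 sketch in which every server derives the same random signs from a shared seed (correlated randomness), so that local sketches add up to the sketch of the aggregate vector (composability). This is a minimal sketch of the general idea, not the paper's construction; all function names here are hypothetical.

```python
import random

def make_sign_hashes(n, reps, seed):
    """Correlated randomness: every server derives identical random
    +/-1 signs from a common seed, with no extra communication."""
    rng = random.Random(seed)
    return [[rng.choice((-1, 1)) for _ in range(n)] for _ in range(reps)]

def sketch(vec, signs):
    """AMS-style linear sketch of one server's local vector."""
    return [sum(s * v for s, v in zip(row, vec)) for row in signs]

def merge(sk_a, sk_b):
    """Composability: since the sketch is linear, the sketch of a sum
    of vectors equals the entrywise sum of the sketches."""
    return [a + b for a, b in zip(sk_a, sk_b)]

def estimate_f2(sk):
    """Estimate F2 = sum_i x_i^2 by averaging squared counters."""
    return sum(z * z for z in sk) / len(sk)

n, reps = 8, 200
signs = make_sign_hashes(n, reps, seed=42)
x1 = [1, 2, 0, 0, 3, 0, 1, 0]   # server 1's local vector
x2 = [0, 1, 4, 0, 0, 2, 0, 1]   # server 2's local vector
combined = merge(sketch(x1, signs), sketch(x2, signs))
x = [a + b for a, b in zip(x1, x2)]
exact = sum(v * v for v in x)
# estimate_f2(combined) approximates the F2 of the aggregate vector x
```

The design point is that `merge` requires no knowledge of the raw data, so sketches can be combined along any tree or path in the network, which is exactly the property that makes such summaries useful beyond the star topology of the coordinator model.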

Can the ideas of correlated randomness and composable sketches be applied to develop efficient distributed algorithms for other fundamental problems in graph theory and combinatorial optimization?

The ideas of correlated randomness and composable sketches could also yield efficient distributed algorithms for fundamental problems in graph theory and combinatorial optimization, such as graph coloring, maximum flow, minimum cut, and shortest paths. Because composable sketches can be merged along any communication pattern, they are well suited to reducing both communication complexity and computation in distributed graph algorithms, and the same machinery extends to problems in network analysis, social network modeling, and network flow optimization in distributed systems.