The paper introduces a new family of mathematical objects called "tensor cumulants" for analyzing low-degree polynomial algorithms on statistical inference problems over invariant tensor distributions. The key insights are:
Tensor networks and graph moments provide a natural class of invariant polynomials that can be used as the building blocks for low-degree algorithms.
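As a minimal illustration (not the paper's exact construction), a graph moment contracts copies of a symmetric tensor along the edges of a regular multigraph, producing a rotation-invariant scalar. The tensor, dimension, and choice of graphs below are hypothetical examples:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Build a symmetric order-3 tensor by averaging a Gaussian array
# over all index permutations (illustrative, not the paper's model).
G = rng.standard_normal((n, n, n))
perms = ((0, 1, 2), (0, 2, 1), (1, 0, 2), (1, 2, 0), (2, 0, 1), (2, 1, 0))
T = sum(np.transpose(G, p) for p in perms) / 6.0

# Graph moment for the 2-vertex, 3-edge multigraph: contract all three
# index pairs between two copies of T. This equals ||T||_F^2.
moment_theta = np.einsum('ijk,ijk->', T, T)

# Graph moment for K4 (4 vertices, each of degree 3): each of the six
# edges contracts one index between a pair of copies of T.
moment_K4 = np.einsum('abc,ade,bdf,cef->', T, T, T, T)
```

Because every index is contracted, each such moment is a scalar polynomial in the entries of T that is invariant under orthogonal changes of basis.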
The tensor cumulants form an explicit, nearly orthogonal basis for this space of invariant polynomials, generalizing the free cumulants of free probability theory from matrices to tensors.
Using the tensor cumulants, the paper provides new results for two concrete problems:
a. Tensor PCA: The paper unifies and strengthens previous results on the computational hardness of tensor PCA below a critical signal-to-noise ratio, and gives a new tight analysis of reconstruction with low-degree polynomials.
b. Distinguishing Wigner from Wishart tensors: The paper establishes a sharp computational threshold for this problem, providing evidence of a new statistical-computational gap in the central limit theorem for random tensors.
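The two models above can be sketched concretely. Below is a hedged toy version: a spiked tensor for tensor PCA (rank-one signal plus symmetric Gaussian noise) and a Wishart-style tensor built as a sum of i.i.d. rank-one terms, which the central limit theorem drives toward a Gaussian (Wigner) tensor as the number of terms grows. All dimensions and scalings are illustrative choices, not the paper's exact normalizations:

```python
import numpy as np

rng = np.random.default_rng(1)
n, lam = 8, 2.5  # dimension and signal-to-noise ratio (illustrative)

def symmetrize3(W):
    """Average an order-3 array over all six index permutations."""
    perms = ((0, 1, 2), (0, 2, 1), (1, 0, 2), (1, 2, 0), (2, 0, 1), (2, 1, 0))
    return sum(np.transpose(W, p) for p in perms) / 6.0

# Spiked tensor model for tensor PCA: planted unit vector v,
# observed tensor = lam * v^{tensor 3} + symmetric Gaussian noise.
v = rng.standard_normal(n)
v /= np.linalg.norm(v)
noise = symmetrize3(rng.standard_normal((n, n, n)))
spiked = lam * np.einsum('i,j,k->ijk', v, v, v) + noise

# Wishart-style order-3 tensor: sum of d rank-one terms g_s^{tensor 3}.
# For large d (after suitable centering/scaling) this resembles a
# Gaussian tensor -- the CLT regime behind the distinguishing problem.
d = 50
g = rng.standard_normal((d, n))
wishart = np.einsum('si,sj,sk->ijk', g, g, g) / np.sqrt(d)
```

The distinguishing question is whether a low-degree (invariant) polynomial of the observed tensor can tell `wishart` apart from a pure Gaussian tensor; the paper pins down the threshold at which this becomes computationally feasible.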
More broadly, the tensor cumulants are shown to be valuable mathematical objects in their own right, carrying key properties of free probability theory from matrices over to tensors.