Core Concepts
Accurately estimating the score functions of Gaussian pancakes distributions, a class of computationally hard distributions, is computationally intractable, even though the task is statistically achievable with polynomially many samples.
Abstract
The paper establishes the computational hardness of accurately estimating the score functions of Gaussian pancakes distributions, a class of distributions that are computationally indistinguishable from the standard Gaussian under widely believed cryptographic hardness assumptions.
Key highlights:
Gaussian pancakes distributions are "backdoored" Gaussians that are distributed as a (noisy) discrete Gaussian along a secret direction and as a standard Gaussian in the remaining directions.
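The construction can be sketched in code. Below is a minimal illustrative sampler (parameter names, grid spacing, and noise level are my own choices, not values from the paper): the component along a secret unit direction is a standard Gaussian snapped to a grid (a proxy for a noisy discrete Gaussian), while all orthogonal directions remain standard Gaussian.

```python
import numpy as np

def sample_gaussian_pancakes(n, d, spacing=0.5, noise=0.05, rng=None):
    """Sample n points in R^d from an illustrative Gaussian pancakes model.

    Along a random secret unit direction u, the coordinate is a standard
    Gaussian rounded to a grid of the given spacing (a noisy discrete
    Gaussian proxy) plus small Gaussian noise; all directions orthogonal
    to u are standard Gaussian.
    """
    rng = np.random.default_rng(rng)
    u = rng.standard_normal(d)
    u /= np.linalg.norm(u)                   # secret direction
    x = rng.standard_normal((n, d))          # standard Gaussian in all directions
    t = rng.standard_normal(n)               # component along u, pre-discretization
    t = spacing * np.round(t / spacing)      # snap to the grid
    t += noise * rng.standard_normal(n)      # small noise along u
    # replace the component of x along u with the noisy discrete sample
    x += np.outer(t - x @ u, u)
    return x, u
```

Projections onto u then concentrate near multiples of the spacing (the "pancakes"), while any other direction looks like a standard Gaussian.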
Previous works have shown that the problem of distinguishing Gaussian pancakes from the standard Gaussian is computationally hard, with implications for lattice-based cryptography.
The author shows that computationally efficient L2-accurate score estimation for Gaussian pancakes distributions implies an efficient algorithm for distinguishing them from the standard Gaussian.
This establishes a statistical-to-computational gap for L2-accurate score estimation, meaning that what is statistically achievable may not be computationally feasible without stronger assumptions on the data distribution.
The author provides a reduction from the Gaussian pancakes distinguishing problem to L2-accurate score estimation, showing that score estimation for Gaussian pancakes is at least as hard as the distinguishing problem itself.
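The intuition behind such a reduction can be sketched as follows (a simplified illustration, not the paper's exact procedure, with an arbitrary threshold): the standard Gaussian has score -x, so an L2-accurate score oracle for the sample distribution yields a test statistic that separates the two hypotheses.

```python
import numpy as np

def score_distinguisher(samples, score_fn, threshold=0.5):
    """Distinguish pancakes from N(0, I) given an L2-accurate score oracle.

    Since the standard Gaussian has score -x, the empirical gap
    E ||score_fn(x) + x||^2 is small under N(0, I) and large under a
    distribution whose score deviates from -x, such as Gaussian pancakes.
    The threshold is an illustrative constant.
    """
    gap = np.mean(np.sum((score_fn(samples) + samples) ** 2, axis=1))
    return "pancakes" if gap > threshold else "gaussian"
```

An efficient, accurate score estimator would therefore break the presumed hardness of the distinguishing problem, which is the contrapositive form of the hardness result.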
The hardness of score estimation arises solely from the hardness of learning, not of representation: the score functions themselves can be efficiently approximated by common function classes used in practice.
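To illustrate that the scores are easy to represent, here is a closed-form approximation of the 1D score along the secret direction, modeling the noisy discrete Gaussian as a finite Gaussian mixture with components at grid points (the modeling choice and all parameter values are illustrative, not from the paper); truncating the sum makes it cheap to evaluate.

```python
import numpy as np

def pancake_score_1d(t, spacing=0.5, sigma=0.05, K=20):
    """Approximate score d/dt log p(t) of a 1D noisy discrete Gaussian,
    modeled as a Gaussian mixture with components at grid points
    k * spacing for |k| <= K, weights proportional to exp(-(k*spacing)^2/2),
    and component noise sigma."""
    k = np.arange(-K, K + 1)
    mu = spacing * k
    logw = -0.5 * mu**2
    # log of each (unnormalized) component density at t
    z = (np.asarray(t)[..., None] - mu) / sigma
    logp_k = logw - 0.5 * z**2
    # score = sum_k p_k * (-(t - mu_k) / sigma^2) / sum_k p_k,
    # computed stably by subtracting the max log-density
    w = np.exp(logp_k - logp_k.max(axis=-1, keepdims=True))
    num = np.sum(w * (-(np.asarray(t)[..., None] - mu) / sigma**2), axis=-1)
    return num / np.sum(w, axis=-1)
```

The score vanishes at each grid point and pulls samples back toward the nearest pancake, so a small network or kernel model can fit it; the obstacle is learning the secret direction from data, not expressing the function.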