Tight Bounds on the Contraction of Locally Differentially Private Mechanisms
The authors derive tight upper bounds on how much an ε-LDP mechanism contracts f-divergences — including the KL-divergence and the χ²-divergence — between its output distributions under different input distributions. These contraction bounds are then used to establish locally private versions of powerful information-theoretic tools for bounding minimax estimation risks.
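As a concrete illustration of the contraction phenomenon (a numerical sketch, not the paper's bounds or proof technique), consider binary randomized response, a canonical ε-LDP mechanism: pushing two Bernoulli input distributions through it strictly shrinks the KL-divergence between them. All function names below are illustrative.

```python
import math


def randomized_response(p: float, eps: float) -> float:
    """Output distribution of binary randomized response (an eps-LDP
    mechanism) applied to a Bernoulli(p) input: the true bit is reported
    with probability e^eps / (1 + e^eps), flipped otherwise."""
    keep = math.exp(eps) / (1.0 + math.exp(eps))
    return p * keep + (1.0 - p) * (1.0 - keep)


def kl_bernoulli(p: float, q: float) -> float:
    """KL-divergence (in nats) between Bernoulli(p) and Bernoulli(q)."""
    return p * math.log(p / q) + (1.0 - p) * math.log((1.0 - p) / (1.0 - q))


eps = 1.0
p, q = 0.9, 0.1

kl_in = kl_bernoulli(p, q)
kl_out = kl_bernoulli(randomized_response(p, eps),
                      randomized_response(q, eps))

# The mechanism contracts the divergence: kl_out < kl_in,
# consistent with the (strong) data processing inequality.
print(f"input KL = {kl_in:.4f}, output KL = {kl_out:.4f}")
```

Smaller ε forces the reporting probability toward 1/2, making the two output distributions harder to distinguish, which is exactly the effect the paper's tight bounds quantify.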