The paper investigates the contraction properties of locally differentially private (LDP) mechanisms. The authors derive tight upper bounds on the divergence between the output distributions of an ε-LDP mechanism under different f-divergences, including the KL-divergence and the χ²-divergence.
The key technical results are:
Theorem 1 shows that the contraction coefficients under KL-divergence, χ²-divergence, and squared Hellinger distance are upper bounded by (e^ε − 1)² / (e^ε + 1)², and that this bound is tight.
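This bound can be checked numerically. The sketch below uses binary randomized response, the canonical ε-LDP mechanism, with a pair of arbitrary Bernoulli inputs (both choices are illustrative, not taken from the paper), and verifies that the KL-divergence ratio stays below (e^ε − 1)² / (e^ε + 1)²:

```python
import math

def randomized_response(p1, eps):
    """Push a Bernoulli(p1) input through binary randomized response:
    the true bit is kept with probability e^eps / (e^eps + 1)."""
    keep = math.exp(eps) / (math.exp(eps) + 1)
    return p1 * keep + (1 - p1) * (1 - keep)

def kl_bernoulli(p, q):
    """KL divergence between Bernoulli(p) and Bernoulli(q)."""
    return p * math.log(p / q) + (1 - p) * math.log((1 - p) / (1 - q))

eps = 1.0
p, q = 0.9, 0.1                       # two arbitrary input distributions
pk = randomized_response(p, eps)      # output distribution under P
qk = randomized_response(q, eps)      # output distribution under Q

ratio = kl_bernoulli(pk, qk) / kl_bernoulli(p, q)   # contraction ratio
bound = ((math.exp(eps) - 1) / (math.exp(eps) + 1)) ** 2
print(ratio, bound)   # the ratio stays below the bound
```

Note that (e^ε − 1) / (e^ε + 1) = tanh(ε/2), so the bound tends to 0 as ε → 0 (full contraction) and to 1 as ε → ∞ (no privacy, no contraction).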
Theorem 2 provides an upper bound on χ²(PK‖QK) in terms of TV(P, Q) and ε, which is significantly tighter than previous bounds, especially for ε ≥ 1.
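The quantities appearing in Theorem 2 can likewise be computed for binary randomized response. The sketch below only evaluates χ²(PK‖QK) and TV(P, Q) for a pair of hypothetical Bernoulli inputs to illustrate how sharply privatization shrinks the χ²-divergence; it does not reproduce the theorem's bound, whose exact form is given in the paper:

```python
import math

def randomized_response(p1, eps):
    # Keep the true bit with probability e^eps / (e^eps + 1).
    keep = math.exp(eps) / (math.exp(eps) + 1)
    return p1 * keep + (1 - p1) * (1 - keep)

def chi2_bernoulli(p, q):
    # chi^2(P || Q) for Bernoulli(p), Bernoulli(q): (p - q)^2 / (q (1 - q))
    return (p - q) ** 2 / (q * (1 - q))

def tv_bernoulli(p, q):
    # Total variation distance between Bernoulli(p) and Bernoulli(q).
    return abs(p - q)

eps = 1.0
p, q = 0.9, 0.1                       # illustrative input distributions
pk, qk = randomized_response(p, eps), randomized_response(q, eps)

# The output chi-squared divergence is far smaller than the input one,
# even though TV(P, Q) is large:
print(chi2_bernoulli(p, q), chi2_bernoulli(pk, qk), tv_bernoulli(p, q))
```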
The authors then leverage these technical results to develop a systematic framework for quantifying the cost of local privacy across a range of statistical problems.
The results demonstrate that the authors' technical contributions lead to tighter privacy analyses compared to the state-of-the-art in several statistical problems.
Key insights from https://arxiv.org/pdf/2210.13386.pdf by Shahab Asood... at arxiv.org, 05-06-2024