The paper proposes a novel algorithm called Differentially private Neural Augmentation (DNA) for statistical contact tracing. The key contributions are:
Augmenting the statistical inference in contact tracing with a learned neural network while ensuring that the augmentation satisfies differential privacy, thereby bridging the promising field of neural augmentation with differential privacy.
Identifying a novel hierarchy of privacy in contact tracing and providing a theoretical proof of differential privacy at the most general level of this hierarchy.
Evaluating the methods on a widely used simulator, showing that even in challenging situations like noisy tests or agents not following the protocol, the DNA method significantly reduces the peak infection rate compared to prior differentially private methods.
The statistical model for contact tracing is based on a Markov chain of disease states (Susceptible, Exposed, Infected, Recovered) and noisy observations from COVID-19 tests. The authors augment the statistical inference with a neural network that learns patterns not captured by the statistical model, while bounding the sensitivity of the neural network to ensure differential privacy.
The analysis identifies three levels of privacy composition in contact tracing: per message, per day, and across multiple days. The authors prove that the sensitivity of the Factorized Neighbors (FN) algorithm is bounded, which motivates designing the neural augmentation module to satisfy a similar bound.
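A standard way to bound the sensitivity of a learned module is to clip its output and add noise calibrated to the clipping bound. The sketch below is a generic illustration of that idea, not the paper's exact mechanism; the function name, clipping scheme, and use of the Laplace mechanism are assumptions for exposition.

```python
import numpy as np

def dp_augmentation(nn_output: float, clip: float, epsilon: float,
                    rng: np.random.Generator) -> float:
    """Release a neural-augmentation term under epsilon-DP (illustrative).

    Clipping bounds the sensitivity: changing one agent's data can move the
    clipped value by at most 2 * clip, so Laplace noise with scale
    (2 * clip) / epsilon yields epsilon-differential privacy per release.
    """
    clipped = float(np.clip(nn_output, -clip, clip))
    sensitivity = 2.0 * clip
    noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
    return clipped + noise

# Example release of one per-message augmentation term.
rng = np.random.default_rng(0)
noisy_term = dp_augmentation(3.0, clip=1.0, epsilon=1.0, rng=rng)
```

Clipping caps how much any single agent's data can shift the released value, which is exactly the property the FN sensitivity bound provides for the statistical part of the inference.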
Experiments on a COVID-19 simulator show that at the crucial setting of ε=1 differential privacy, the DNA method achieves significantly lower peak infection rates compared to prior differentially private and non-private methods. The method also demonstrates robustness to challenges like noisy tests and non-compliant agents.
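To see how a total budget like ε=1 relates to the privacy hierarchy, basic sequential composition says per-message budgets add up into a per-day budget, and per-day budgets add up across days. The sketch below uses made-up message counts and budget splits purely for illustration; the paper's analysis may rely on tighter composition bounds.

```python
def compose(epsilons):
    """Basic sequential composition of DP budgets: they add up."""
    return sum(epsilons)

# Illustrative budget split (not the paper's numbers).
messages_per_day = 4
eps_per_message = 0.025
eps_per_day = compose([eps_per_message] * messages_per_day)

num_days = 10
eps_total = compose([eps_per_day] * num_days)
```

Under this split, spending 0.025 per message over 4 messages a day for 10 days exhausts a total budget of ε=1.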
Key insights distilled from the paper by Rob Romijnde... at arxiv.org, 04-23-2024: https://arxiv.org/pdf/2404.13381.pdf