The authors propose a framework for training Generative Adversarial Networks (GANs) on differentially private data using entropic optimal transport, enabling the generator to learn the raw (pre-privatization) data distribution even though it only has access to privatized samples.
The entropic regularization of optimal transport makes learning from privatized data tractable while mitigating the effects of the privatization noise and the curse of dimensionality.
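To make the idea concrete, the following is a minimal sketch of an entropic optimal transport (Sinkhorn) cost between generator samples and privatized samples. It is not the authors' implementation: the regularization strength `eps`, the Laplace noise scale, the sample sizes, and the use of plain Sinkhorn iterations (rather than the authors' full training objective) are all illustrative assumptions.

```python
import numpy as np

def sinkhorn_cost(x, y, eps=0.5, n_iters=200):
    """Entropic OT cost between two empirical point clouds x and y.

    Uses squared-Euclidean ground cost and plain Sinkhorn fixed-point
    iterations on the Gibbs kernel (illustrative, not production code).
    """
    n, m = len(x), len(y)
    # Pairwise squared-Euclidean cost matrix C[i, j] = ||x_i - y_j||^2.
    C = np.sum((x[:, None, :] - y[None, :, :]) ** 2, axis=-1)
    K = np.exp(-C / eps)  # Gibbs kernel for entropic regularization
    a = np.full(n, 1.0 / n)  # uniform weights on x
    b = np.full(m, 1.0 / m)  # uniform weights on y
    u, v = np.ones(n), np.ones(m)
    for _ in range(n_iters):  # alternating marginal-matching updates
        u = a / (K @ v)
        v = b / (K.T @ u)
    P = u[:, None] * K * v[None, :]  # approximate transport plan
    return float(np.sum(P * C))     # transport cost under the plan

rng = np.random.default_rng(0)
raw = rng.normal(0.0, 1.0, size=(200, 2))               # raw data (unseen)
private = raw + rng.laplace(0.0, 0.5, size=raw.shape)   # Laplace-privatized
gen = rng.normal(0.0, 1.0, size=(200, 2))               # generator samples
cost = sinkhorn_cost(gen, private)
```

In a GAN-style setup, `cost` (or a debiased Sinkhorn divergence) would serve as the generator's loss against the privatized batch; the smoothing induced by `eps` is what absorbs part of the privatization noise.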