Abstract
Normalizing flow models have emerged as a popular approach to density estimation, enabling high-quality synthetic data generation as well as exact probability density evaluation. However, in contexts where individuals are directly associated with the training data, releasing such a model raises privacy concerns. In this work, we propose normalizing flow models that provide explicit differential privacy guarantees as a novel approach to privacy-preserving density estimation. We evaluate the efficacy of our approach empirically on benchmark datasets, and we demonstrate that our method substantially outperforms previous state-of-the-art approaches. We additionally show how our algorithm can be applied to the task of differentially private anomaly detection.
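The abstract does not spell out the training procedure, but the standard way to give a flow model explicit differential privacy guarantees is DP-SGD: clip each example's gradient and add calibrated Gaussian noise before each update. Below is a minimal, hypothetical sketch (not the authors' implementation) using a one-parameter-pair affine flow x = mu + sigma * z with a standard-normal base density; all hyperparameters are illustrative.

```python
# Hypothetical sketch: fitting a 1-D affine normalizing flow with
# DP-SGD-style updates (per-example gradient clipping + Gaussian noise).
# Not the paper's implementation; hyperparameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=3.0, scale=2.0, size=4096)  # stand-in "private" dataset

mu, log_sigma = 0.0, 0.0            # flow parameters: x = mu + exp(log_sigma) * z
clip_norm, noise_mult, lr = 4.0, 0.5, 0.1

for _ in range(300):
    batch = rng.choice(data, size=256, replace=False)
    z = (batch - mu) * np.exp(-log_sigma)   # inverse flow: map data to base space
    # Per-example gradients of the negative log-likelihood
    # NLL_i = 0.5 * z_i**2 + log_sigma + const
    g_mu = -z * np.exp(-log_sigma)
    g_ls = 1.0 - z**2
    g = np.stack([g_mu, g_ls], axis=1)
    # Clip each example's gradient to L2 norm <= clip_norm
    norms = np.linalg.norm(g, axis=1, keepdims=True)
    g = g / np.maximum(1.0, norms / clip_norm)
    # Sum clipped gradients, add calibrated Gaussian noise, then average
    noisy_sum = g.sum(axis=0) + rng.normal(0.0, noise_mult * clip_norm, size=2)
    grad = noisy_sum / len(batch)
    mu -= lr * grad[0]
    log_sigma -= lr * grad[1]

print(mu, np.exp(log_sigma))  # should roughly recover mean 3 and scale 2
```

A real multidimensional flow (e.g. affine coupling layers) would follow the same loop, with per-example gradients obtained from an autodiff library; the privacy budget (epsilon, delta) is then accounted across iterations from `noise_mult`, the sampling rate, and the number of steps.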
Citation
Waites, C., & Cummings, R. (2021). Differentially Private Normalizing Flows for Privacy-Preserving Density Estimation. In AIES 2021 - Proceedings of the 2021 AAAI/ACM Conference on AI, Ethics, and Society (pp. 1000–1009). Association for Computing Machinery, Inc. https://doi.org/10.1145/3461702.3462625