The essential task of differentially private data analysis is extending current non-private algorithms into differentially private ones. This extension can be realized through several frameworks, roughly categorized into Laplace/exponential frameworks and private learning frameworks. A Laplace/exponential framework incorporates the Laplace or exponential mechanism directly into a non-private analysis algorithm: for example, by adding Laplace noise to the count steps of the algorithm, or by employing the exponential mechanism when making selections. A private learning framework treats data analysis as a learning problem in terms of optimization, solved by defining a series of objective functions. Compared with the Laplace/exponential framework, a private learning framework has a clearer target, and its results are easier to compare in terms of risk bound or sample complexity. However, private learning frameworks can handle only a limited class of learning algorithms, whereas nearly all types of analysis algorithms can be implemented in a Laplace/exponential framework.
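To make the Laplace/exponential framework concrete, the sketch below shows the Laplace-noise example from the text applied to a count query. This is a minimal illustration, not the chapter's code: the function names are hypothetical, and Laplace noise is drawn as the difference of two exponential samples using only the standard library.

```python
import random

def laplace_noise(scale):
    # Laplace(0, scale) sampled as the difference of two
    # exponential variates, each with mean `scale`.
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def private_count(data, predicate, epsilon):
    # A count query has sensitivity 1: adding or removing one record
    # changes the count by at most 1, so the noise scale is 1/epsilon.
    true_count = sum(1 for x in data if predicate(x))
    return true_count + laplace_noise(1.0 / epsilon)

# Example: privately count records below a threshold.
data = list(range(100))
noisy = private_count(data, lambda x: x < 50, epsilon=1.0)
```

Smaller values of `epsilon` give stronger privacy but inject more noise, so the noisy count deviates further from the true count of 50.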
Zhu, T., Li, G., Zhou, W., & Yu, P. S. (2017). Differentially private data analysis. In Advances in Information Security (Vol. 69, pp. 49–65). Springer New York LLC. https://doi.org/10.1007/978-3-319-62004-6_6