Linking differential identifiability with differential privacy

Abstract

The problem of preserving privacy while mining data has been studied extensively in recent years because of its importance for enabling the sharing of data sets. Differential Identifiability, parameterized by the probability of individual identification ρ, was proposed as a solution to this problem. Our study of the proposed Differential Identifiability model shows two limitations. First, its usability rests on a very strong requirement: that the prior probability of an individual being present in a database is the same for all individuals. Second, there is no formal link between the proposed model and well-known privacy models such as Differential Privacy. This paper presents a new differential identifiability model that prevents the disclosure of an individual's presence in a database while considering an adversary with arbitrary prior knowledge about each individual. We show that the general Laplace noise addition mechanism can be used to satisfy our new differential identifiability definition, and that there is a direct link between differential privacy and our proposed model. The evaluation of our model shows that it provides a good privacy/utility trade-off for most aggregate queries.
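
The abstract refers to the general Laplace noise addition mechanism. For readers unfamiliar with it, the sketch below shows the standard mechanism for a numeric query; the function name `laplace_mechanism`, the example query, and the parameter values are illustrative assumptions, and the paper's actual calibration of the noise scale from the identification probability ρ is not reproduced here.

```python
import numpy as np

def laplace_mechanism(true_answer, sensitivity, epsilon, rng=None):
    """Release a noisy query answer using the standard Laplace mechanism.

    The noise scale is sensitivity / epsilon. The paper's contribution is a
    way to derive such a privacy parameter from the identification
    probability rho; that derivation is not reproduced in this sketch.
    """
    rng = np.random.default_rng() if rng is None else rng
    noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_answer + noise

# Example: a COUNT query has sensitivity 1, since adding or removing one
# individual changes the count by at most 1 (values chosen for illustration).
if __name__ == "__main__":
    true_count = 1024
    print(laplace_mechanism(true_count, sensitivity=1.0, epsilon=0.5))
```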

Citation (APA)

Bkakria, A., Cuppens-Boulahia, N., & Cuppens, F. (2018). Linking differential identifiability with differential privacy. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11149 LNCS, pp. 232–247). Springer Verlag. https://doi.org/10.1007/978-3-030-01950-1_14
