Differential privacy for eye-tracking data


Abstract

As large eye-tracking datasets are created, data privacy is a pressing concern for the eye-tracking community. De-identifying data does not guarantee privacy, because multiple datasets can be linked to make inferences about individuals. A common belief is that aggregating individuals’ data into composite representations such as heatmaps protects the individual. However, we analytically examine the privacy of (noise-free) heatmaps and show that they do not guarantee privacy. We further propose two noise mechanisms that guarantee privacy and analyze their privacy-utility tradeoff. Our analysis reveals that the Gaussian noise mechanism is an elegant solution for preserving the privacy of heatmaps. These results have implications for interdisciplinary research on creating differentially private mechanisms for eye tracking.
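
As a rough illustration of the Gaussian mechanism the abstract mentions, the sketch below adds calibrated Gaussian noise to a heatmap aggregated from per-participant gaze maps. This is an assumption-laden Python sketch, not the authors' implementation: the function name dp_heatmap, the per-participant L2 clipping step, and all parameter values are illustrative, and the noise calibration follows the textbook (epsilon, delta) Gaussian mechanism of Dwork and Roth (valid for epsilon < 1) rather than the paper's exact analysis.

    import numpy as np

    def dp_heatmap(per_user_maps, epsilon, delta, clip_norm=1.0, rng=None):
        """Aggregate per-participant gaze heatmaps with (epsilon, delta)-DP.

        Illustrative sketch only; names and calibration are assumptions,
        not the mechanism from Liu et al. (2019).
        """
        rng = np.random.default_rng() if rng is None else rng

        # Clip each participant's map to L2 norm <= clip_norm so that the
        # L2 sensitivity of the summed heatmap is at most clip_norm.
        clipped = []
        for m in per_user_maps:
            norm = np.linalg.norm(m)
            factor = min(1.0, clip_norm / norm) if norm > 0 else 1.0
            clipped.append(m * factor)
        aggregate = np.sum(clipped, axis=0)

        # Textbook Gaussian mechanism (Dwork & Roth, 2014), epsilon < 1:
        # sigma >= clip_norm * sqrt(2 * ln(1.25 / delta)) / epsilon.
        sigma = clip_norm * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
        return aggregate + rng.normal(0.0, sigma, size=aggregate.shape)

    # Hypothetical usage: 40 participants, 64x64 gaze heatmaps.
    maps = [np.abs(np.random.default_rng(i).normal(size=(64, 64)))
            for i in range(40)]
    noisy_heatmap = dp_heatmap(maps, epsilon=0.5, delta=1e-5)

Clipping is one standard way to bound each individual's influence on the aggregate; the privacy-utility tradeoff the abstract describes then appears directly in sigma, which grows as epsilon and delta shrink.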

Citation (APA)

Liu, A., Xia, L., Duchowski, A., Bailey, R., Holmqvist, K., & Jain, E. (2019). Differential privacy for eye-tracking data. In Eye Tracking Research and Applications Symposium (ETRA). Association for Computing Machinery. https://doi.org/10.1145/3314111.3319823
