A First Look into Privacy Leakage in 3D Mixed Reality Data


Abstract

We have seen a rise in mixed reality (MR) and augmented reality (AR) applications and devices in recent years. We have become familiar with the sensing power of these applications and devices, but we are only starting to realize the nascent risks that these technologies pose to our privacy and security. Current privacy protection measures are primarily aimed at known and well-utilised data types (i.e. location, online activity, biometrics, and so on), while only a few works have looked into the security and privacy risks of MR data, particularly 3D MR data, and into providing protection for it. In this work, we reveal the privacy leakage from released 3D MR data and show how the leakage persists even after spatial generalizations and abstractions are applied. First, we formalize the spatial privacy problem in 3D mixed reality data as well as the adversary model. Then, we demonstrate through an inference model how adversaries can identify 3D spaces and, potentially, infer further spatial information. Moreover, we show how compact 3D MR data can be in terms of memory usage, which allows adversaries to build lightweight 3D inference models of user spaces.
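To make the space-identification threat concrete, the following is a minimal sketch (not the paper's actual inference model) of how an adversary could match captured 3D point data against a small gallery of previously seen spaces. All names (`space_descriptor`, `identify_space`) and the descriptor choice, a histogram of pairwise point distances, are illustrative assumptions; the descriptor is deliberately tiny (a few floats per space), echoing the point that compact representations suffice for lightweight inference:

```python
import numpy as np

def space_descriptor(points, bins=16, max_dist=5.0):
    """Hypothetical compact descriptor of a captured 3D space:
    a normalized histogram of pairwise point distances.
    `points` is an (N, 3) array of 3D coordinates."""
    diffs = points[:, None, :] - points[None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)
    iu = np.triu_indices(len(points), k=1)  # each pair once
    hist, _ = np.histogram(dists[iu], bins=bins,
                           range=(0.0, max_dist), density=True)
    return hist

def identify_space(query_desc, gallery):
    """Return the key of the stored space whose descriptor is
    closest (L1 distance) to the query descriptor."""
    return min(gallery, key=lambda k: np.abs(gallery[k] - query_desc).sum())

# Adversary's gallery: two previously captured (toy) spaces.
rng = np.random.default_rng(0)
room_a = rng.uniform(0.0, 2.0, (200, 3))   # small room
room_b = rng.uniform(0.0, 4.0, (200, 3))   # larger room
gallery = {"room_a": space_descriptor(room_a),
           "room_b": space_descriptor(room_b)}

# A fresh, noisy capture of room A is still matched to room A.
noisy_a = room_a + rng.normal(0.0, 0.05, room_a.shape)
match = identify_space(space_descriptor(noisy_a), gallery)
```

Even this crude descriptor survives perturbation of the raw points, which hints at why simple generalizations of released 3D data may not stop re-identification.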

Citation (APA)
de Guzman, J. A., Thilakarathna, K., & Seneviratne, A. (2019). A First Look into Privacy Leakage in 3D Mixed Reality Data. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11735 LNCS, pp. 149–169). Springer. https://doi.org/10.1007/978-3-030-29959-0_8
