Informational Friction as a Lens for Studying Algorithmic Aspects of Privacy


Abstract

This paper addresses challenges in conceptualizing privacy posed by algorithmic systems that can infer sensitive information from seemingly innocuous data. This type of privacy is of pressing concern due to the rapid adoption of machine learning and artificial intelligence systems in virtually every industry. In this paper, we suggest informational friction, a concept from Floridi's ethics of information, as a valuable conceptual lens for studying algorithmic aspects of privacy. Informational friction describes the amount of work required for one agent to access or alter the information of another. By focusing on the amount of work, rather than the type of information or the manner in which it is collected, informational friction can help to explain why automated analyses should raise privacy concerns independently of, and in addition to, those associated with data collection. As a demonstration, this paper analyzes law enforcement use of facial recognition and Facebook's targeted advertising model using informational friction, and identifies risks inherent to these systems that are not fully captured by another popular framework, Nissenbaum's Contextual Integrity. The paper concludes with a discussion of broader implications, both for privacy research and for privacy regulation.

Citation (APA)

Skeba, P., & Baumer, E. P. S. (2020). Informational Friction as a Lens for Studying Algorithmic Aspects of Privacy. Proceedings of the ACM on Human-Computer Interaction, 4(CSCW2). https://doi.org/10.1145/3415172
