The Impact of Differential Privacy on Recommendation Accuracy and Popularity Bias

Abstract

Collaborative filtering-based recommender systems leverage vast amounts of behavioral user data, which poses severe privacy risks. Thus, random noise is often added to the data to ensure Differential Privacy (DP). However, to date, it is not well understood in which ways this impacts personalized recommendations. In this work, we study how DP affects recommendation accuracy and popularity bias when applied to the training data of state-of-the-art recommendation models. Our findings are three-fold: First, we observe that nearly all users’ recommendations change when DP is applied. Second, recommendation accuracy drops substantially while the popularity of recommended items increases sharply, suggesting that popularity bias worsens. Finally, we find that DP exacerbates popularity bias more severely for users who prefer unpopular items than for users who prefer popular items.
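To make the noise-addition step concrete: a common way to privatize binary user–item interaction data before training is randomized response, a standard local-DP mechanism. The sketch below is illustrative only, assuming a dense NumPy interaction matrix; it is not necessarily the mechanism used in the paper.

```python
import numpy as np

def randomized_response(matrix, epsilon, seed=None):
    """Apply epsilon-local-DP randomized response to a binary matrix:
    each 0/1 entry is kept with probability e^eps / (1 + e^eps)
    and flipped otherwise."""
    rng = np.random.default_rng(seed)
    p_keep = np.exp(epsilon) / (1.0 + np.exp(epsilon))
    flip = rng.random(matrix.shape) >= p_keep  # True where the bit flips
    return np.where(flip, 1 - matrix, matrix)

# Toy user-item interaction matrix (rows: users, cols: items)
interactions = np.array([[1, 0, 1],
                         [0, 1, 0]])
private = randomized_response(interactions, epsilon=1.0, seed=0)
```

Smaller values of `epsilon` flip more entries (stronger privacy, more distortion of the training data), which is the trade-off the paper studies with respect to accuracy and popularity bias.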

Citation (APA)

Müllner, P., Lex, E., Schedl, M., & Kowald, D. (2024). The Impact of Differential Privacy on Recommendation Accuracy and Popularity Bias. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 14611 LNCS, pp. 466–482). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-031-56066-8_33
