Stereotype-aware collaborative filtering


Abstract

In collaborative filtering, recommendations are made from user feedback on a small subset of products. In this paper, we show that even when sensitive attributes are not used to fit the models, recommendations may nevertheless suffer a disparate impact. We propose a definition of fairness for recommender systems stating that the ranking of items should be independent of the sensitive attribute. We design a co-clustering of users and items that processes exogenous sensitive attributes to remove their influence and return fair recommendations. We prove that our model ensures approximately fair recommendations provided that the classification of users approximately respects statistical parity.
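To make the statistical parity condition in the abstract concrete, here is a minimal, hypothetical sketch (not the authors' code): a user clustering satisfies statistical parity when the distribution of cluster assignments is the same within each sensitive group, and we can measure how far a clustering deviates from that. The function name `parity_gap` is an illustrative choice.

```python
import numpy as np

def parity_gap(clusters, sensitive):
    """Largest deviation, over clusters, between the per-group
    rates of assignment to that cluster. 0 means exact statistical
    parity of the clustering with respect to the sensitive attribute."""
    clusters = np.asarray(clusters)
    sensitive = np.asarray(sensitive)
    gaps = []
    for k in np.unique(clusters):
        # Rate at which each sensitive group is assigned to cluster k.
        rates = [np.mean(clusters[sensitive == g] == k)
                 for g in np.unique(sensitive)]
        gaps.append(max(rates) - min(rates))
    return max(gaps)

# Both sensitive groups split evenly across clusters: gap is 0.
print(parity_gap([0, 1, 0, 1], [0, 0, 1, 1]))  # → 0.0
# Clusters perfectly aligned with the sensitive attribute: gap is 1.
print(parity_gap([0, 0, 1, 1], [0, 0, 1, 1]))  # → 1.0
```

The paper's guarantee can then be read as: the smaller this gap for the user classification, the closer the resulting recommendations are to the paper's fairness criterion.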

Citation (APA)

Frisch, G., Leger, J. B., & Grandvalet, Y. (2021). Stereotype-aware collaborative filtering. In Proceedings of the 16th Conference on Computer Science and Intelligence Systems, FedCSIS 2021 (pp. 69–79). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.15439/2021F117
