Two-Sided Fairness in Non-Personalised Recommendations (Student Abstract)

Abstract

Recommender systems are among the most widely used services on online platforms, suggesting potential items to end-users. These services often rely on machine learning techniques for which fairness is a serious concern, especially when the downstream services can cause social ramifications. Focusing on non-personalised (global) recommendations on news media platforms (e.g., the top-k trending topics on Twitter, or the top-k news items on a news platform), we discuss together two specific fairness concerns that have traditionally been studied separately: user fairness (Chakraborty et al. 2019) and organisational fairness (Burke et al. 2020). User fairness captures the idea of representing the choices of all individual users in a global recommendation, while organisational fairness tries to ensure politically/ideologically balanced recommendation sets; this makes user fairness a user-side requirement and organisational fairness a platform-side requirement. For user fairness, we experiment with methods from social choice theory, i.e., various voting rules known to better represent user choices in their outcomes. Applying these voting rules to the recommendation setup, we observe high user satisfaction scores (Table 2). For organisational fairness, we propose a bias metric (Eq. 1) that measures the aggregate ideological bias of a recommended set of items (articles). Analysing the results of the voting-rule-based recommendations, we find that while the well-known voting rules perform well on the user side, they show high bias values and are clearly unsuitable for the organisational requirements of the platforms. Thus, there is a need for an encompassing mechanism that cohesively bridges the ideas of user fairness and organisational fairness. In this student abstract, we frame the elementary ideas behind such a mechanism along with the motivation for it.
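To make the two sides concrete, the sketch below aggregates individual user rankings into a global top-k list with the Borda count, one of the classical voting rules from social choice theory, and then scores the resulting set with a simple aggregate bias measure. The averaging form of the bias score and the per-item leaning values in [-1, +1] are illustrative assumptions, not the paper's actual Eq. 1 or its voting rules.

```python
from collections import defaultdict

def borda_topk(user_rankings, k):
    """Aggregate each user's ranking into a global top-k list using the
    Borda count: an item at position p in a ranking of length m earns
    m - 1 - p points. Ties are broken lexicographically."""
    scores = defaultdict(int)
    for ranking in user_rankings:
        m = len(ranking)
        for pos, item in enumerate(ranking):
            scores[item] += m - 1 - pos
    return sorted(scores, key=lambda it: (-scores[it], it))[:k]

def aggregate_bias(topk, item_bias):
    """Illustrative (assumed) aggregate bias of a recommended set: the
    mean of per-item ideological leaning scores in [-1, +1]
    (negative = one leaning, positive = the other); 0 is balanced."""
    return sum(item_bias[item] for item in topk) / len(topk)

# Three users rank three articles; b wins on user satisfaction,
# and the resulting pair {b, a} happens to be ideologically balanced.
rankings = [["a", "b", "c"], ["b", "a", "c"], ["b", "c", "a"]]
topk = borda_topk(rankings, k=2)          # ["b", "a"]
bias = aggregate_bias(topk, {"a": -1.0, "b": 1.0, "c": 0.0})  # 0.0
```

A voting rule like this optimises only the user side; the abstract's point is precisely that such rules can still yield recommendation sets with high aggregate bias, so the platform-side constraint must be imposed separately.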

Citation (APA)
Mondal, A. S., Bal, R., Sinha, S., & Patro, G. K. (2021). Two-Sided Fairness in Non-Personalised Recommendations (Student Abstract). In Proceedings of the 35th AAAI Conference on Artificial Intelligence, AAAI 2021 (Vol. 35, No. 18, pp. 15851–15852). Association for the Advancement of Artificial Intelligence. https://doi.org/10.1609/aaai.v35i18.17922
