Fairness considerations have recently attracted growing attention, especially in the context of intelligent decision-making systems. Explainable recommendation systems, for example, may suffer from both explanation bias and performance disparity. We show that inactive users may be more susceptible to receiving unsatisfactory recommendations because of their insufficient training data, and that their recommendations may be biased by the training records of active users due to the nature of collaborative filtering, which leads to unfair treatment by the system. In this paper, we analyze different groups of users according to their level of activity and find that bias exists in recommendation performance between these groups. Empirically, we find that this performance gap is caused by the disparity of data distribution, specifically the knowledge graph path distribution in this work. We propose a fairness-constrained approach via heuristic re-ranking to mitigate this unfairness problem in the context of explainable recommendation over knowledge graphs. We experiment on several real-world datasets with state-of-the-art knowledge graph-based explainable recommendation algorithms. The promising results show that our algorithm not only provides high-quality explainable recommendations but also reduces recommendation unfairness in several respects.
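To make the idea of fairness-constrained heuristic re-ranking concrete, here is a minimal sketch, not the authors' implementation: it assumes per-item relevance scores and a per-item knowledge-graph path-diversity measure (both hypothetical names), and greedily repairs a relevance-ranked top-k list until a fairness proxy, here the list's average path diversity, meets a threshold. In the paper the fairness criterion is defined over user groups (active vs. inactive); the per-list threshold below simply stands in for whatever constraint the re-ranker enforces.

```python
# Hypothetical sketch of fairness-constrained heuristic re-ranking.
# All names (relevance, path_diversity, min_avg_diversity) are illustrative
# assumptions, not the paper's actual formulation.
from typing import Dict, List


def fairness_constrained_rerank(candidates: List[str],
                                relevance: Dict[str, float],
                                path_diversity: Dict[str, float],
                                top_k: int = 10,
                                min_avg_diversity: float = 0.5) -> List[str]:
    """Re-rank candidates by relevance, then heuristically swap items until
    the list's average explanation-path diversity meets the threshold."""
    ranked = sorted(candidates, key=lambda i: relevance[i], reverse=True)
    selected = ranked[:top_k]
    pool = ranked[top_k:]

    def avg_div(items: List[str]) -> float:
        return sum(path_diversity[i] for i in items) / len(items)

    # Heuristic repair: while the fairness proxy is violated, replace the
    # least-diverse selected item with the most-diverse remaining candidate.
    while pool and selected and avg_div(selected) < min_avg_diversity:
        out_item = min(selected, key=lambda i: path_diversity[i])
        in_item = max(pool, key=lambda i: path_diversity[i])
        if path_diversity[in_item] <= path_diversity[out_item]:
            break  # no remaining swap can improve the constraint
        selected.remove(out_item)
        pool.remove(in_item)
        selected.append(in_item)

    # Return the repaired list ordered by relevance.
    return sorted(selected, key=lambda i: relevance[i], reverse=True)


if __name__ == "__main__":
    items = ["i1", "i2", "i3", "i4"]
    rel = {"i1": 0.9, "i2": 0.8, "i3": 0.6, "i4": 0.5}
    div = {"i1": 0.1, "i2": 0.2, "i3": 0.9, "i4": 0.7}
    print(fairness_constrained_rerank(items, rel, div, top_k=2))
```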
CITATION STYLE
Fu, Z., Xian, Y., Gao, R., Zhao, J., Huang, Q., Ge, Y., … De Melo, G. (2020). Fairness-Aware Explainable Recommendation over Knowledge Graphs. In SIGIR 2020 - Proceedings of the 43rd International ACM SIGIR Conference on Research and Development in Information Retrieval (pp. 69–78). Association for Computing Machinery, Inc. https://doi.org/10.1145/3397271.3401051