GUIDE: Group Equality Informed Individual Fairness in Graph Neural Networks


Abstract

Graph Neural Networks (GNNs) are playing increasingly important roles in critical decision-making scenarios due to their exceptional performance and end-to-end design. However, concerns have been raised that GNNs could make biased decisions against underprivileged groups or individuals. To remedy this issue, researchers have proposed various fairness notions, including individual fairness, which gives similar predictions to similar individuals. However, existing methods for individual fairness rely on the Lipschitz condition: they only optimize overall individual fairness and disregard the equality of individual fairness between groups. This leads to drastically different levels of individual fairness among groups. We tackle this problem by proposing a novel GNN framework, GUIDE, to achieve group equality informed individual fairness in GNNs. We aim not only to achieve individual fairness but also to equalize the levels of individual fairness among groups. Specifically, our framework operates on the similarity matrix of individuals to learn personalized attention that achieves individual fairness without group-level disparity. Comprehensive experiments on real-world datasets demonstrate that GUIDE obtains a good balance between group equality informed individual fairness and model utility. The open-source implementation of GUIDE can be found here: https://github.com/mikesong724/GUIDE.
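To make the abstract's two quantities concrete, the following is a minimal NumPy sketch (not the authors' implementation) of the standard Lipschitz-style individual fairness objective over a similarity matrix, and of the group-level disparity in that objective which GUIDE aims to equalize. The function names, the Laplacian-based formulation, and the max-minus-min gap are assumptions for illustration, consistent with the common formulation in the individual-fairness-on-graphs literature:

```python
import numpy as np

def individual_fairness_loss(S, Y):
    """Lipschitz-style individual fairness bias:
    sum_{i,j} S_ij * ||y_i - y_j||^2 = 2 * tr(Y^T L_S Y),
    where L_S = D - S is the Laplacian of the similarity matrix S.
    Smaller values mean similar individuals receive similar predictions."""
    D = np.diag(S.sum(axis=1))
    L = D - S
    return 2.0 * np.trace(Y.T @ L @ Y)

def group_fairness_gap(S, Y, groups):
    """Assumed per-group disparity measure: split the trace above into
    per-node contributions, average within each sensitive group, and
    report the gap between the best- and worst-off groups."""
    D = np.diag(S.sum(axis=1))
    L = D - S
    # Node-wise contribution: per_node[i] = (L @ Y)[i] . Y[i]
    per_node = np.einsum('ij,ij->i', L @ Y, Y)
    levels = {g: per_node[groups == g].mean() for g in np.unique(groups)}
    return max(levels.values()) - min(levels.values())
```

Identical predictions for all nodes drive the loss to zero regardless of S; a nonzero group gap indicates that overall individual fairness is achieved unevenly across groups, which is exactly the disparity the abstract says Lipschitz-only methods ignore.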




Citation (APA)

Song, W., Dong, Y., Liu, N., & Li, J. (2022). GUIDE: Group Equality Informed Individual Fairness in Graph Neural Networks. In Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (pp. 1625–1634). Association for Computing Machinery. https://doi.org/10.1145/3534678.3539346

