Variational Fair Clustering

Abstract

We propose a general variational framework for fair clustering, which integrates an original Kullback-Leibler (KL) fairness term with a large class of clustering objectives, including prototype-based and graph-based ones. Fundamentally different from existing combinatorial and spectral solutions, our variational multi-term approach enables control over the trade-off between the fairness and clustering objectives. We derive a general tight upper bound based on a concave-convex decomposition of our fairness term, its Lipschitz-gradient property, and Pinsker's inequality. This tight upper bound can be optimized jointly with various clustering objectives, yielding a scalable solution with a convergence guarantee. Interestingly, at each iteration, the algorithm performs an independent update for each assignment variable, so it can be easily distributed for large-scale datasets. This scalability is important because it enables exploring different trade-off levels between the fairness and clustering objectives. Unlike spectral relaxation, our formulation does not require computing an eigenvalue decomposition of an affinity matrix. We report comprehensive evaluations and comparisons with state-of-the-art methods over various fair-clustering benchmarks, which show that our variational formulation yields highly competitive solutions in terms of both fairness and clustering objectives.
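To make the flavor of the framework more concrete, below is a minimal NumPy sketch of a fairness-regularized, K-means-style loop with independent per-point soft-assignment updates, as the abstract describes. The particular fairness penalty and update rule here are simplified assumptions for illustration, not the paper's exact bound optimization; the function name, the `lam` trade-off parameter, and the group-indicator matrix `V` are hypothetical.

```python
import numpy as np

def fair_clustering_sketch(X, V, K, lam=1.0, n_iter=20):
    """Illustrative fairness-regularized K-means-style loop with
    independent per-point soft-assignment updates (a hypothetical
    simplification, not the paper's exact derivation).

    X   : (N, D) data points
    V   : (N, J) one-hot demographic-group indicators
    K   : number of clusters
    lam : trade-off weight between clustering and fairness terms
    """
    N, _ = X.shape
    U = V.mean(axis=0)                       # overall group proportions (target)
    rng = np.random.default_rng(0)
    C = X[rng.choice(N, K, replace=False)]   # initial prototypes
    S = np.full((N, K), 1.0 / K)             # soft assignments

    for _ in range(n_iter):
        # Clustering term: squared point-to-prototype costs, shape (N, K).
        A = ((X[:, None, :] - C[None, :, :]) ** 2).sum(-1)

        # Cluster-wise group proportions P_k, shape (K, J).
        Pk = (S.T @ V) / (S.sum(0)[:, None] + 1e-12)

        # Simple fairness penalty: assigning a point to a cluster where its
        # group is already over-represented (P_kj > U_j) becomes more costly.
        B = V @ np.log((Pk + 1e-12) / (U[None, :] + 1e-12)).T   # (N, K)

        # Independent softmax-like update of each point's assignment.
        logits = -(A + lam * B)
        logits -= logits.max(axis=1, keepdims=True)
        S = np.exp(logits)
        S /= S.sum(axis=1, keepdims=True)

        # Prototype update from the soft assignments.
        C = (S.T @ X) / (S.sum(0)[:, None] + 1e-12)

    return S, C
```

In this sketch, setting `lam=0` reduces the loop to a plain soft K-means update, while increasing `lam` trades clustering cost for group balance within clusters, mirroring the controllable trade-off the abstract emphasizes.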

Citation (APA)

Ziko, I. M., Yuan, J., Granger, E., & Ayed, I. B. (2021). Variational Fair Clustering. In 35th AAAI Conference on Artificial Intelligence, AAAI 2021 (Vol. 12B, pp. 11202–11209). Association for the Advancement of Artificial Intelligence. https://doi.org/10.1609/aaai.v35i12.17336
