Contrastive Hierarchical Clustering

Abstract

Deep clustering has been dominated by flat models, which split a dataset into a predefined number of groups. Although recent methods achieve extremely high similarity with the ground truth on popular benchmarks, the information contained in a flat partition is limited. In this paper, we introduce CoHiClust, a Contrastive Hierarchical Clustering model based on deep neural networks, which can be applied to typical image data. By employing a self-supervised learning approach, CoHiClust distills the base network into a binary tree without access to any labeled data. The hierarchical clustering structure can be used to analyze the relationships between clusters as well as to measure the similarity between data points. Experiments demonstrate that CoHiClust generates a reasonable cluster structure that is consistent with our intuition and image semantics, and that it obtains superior clustering accuracy on most image datasets compared with state-of-the-art flat clustering models. Our implementation is available at https://github.com/MichalZnalezniak/Contrastive-Hierarchical-Clustering.
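To make the idea of "distilling the base network into a binary tree" concrete, the sketch below shows one common way such a tree-structured clustering head can be built on top of a contrastive backbone: each internal node is a learned gate that outputs the probability of taking the left branch, and a sample's leaf (cluster) probabilities are the products of the branch probabilities along each root-to-leaf path. This is not the authors' implementation (see the GitHub repository above for that); the class and parameter names (TreeHead, feat_dim, depth) are illustrative, and the CoHiClust training objective and regularizers are omitted.

    # Minimal sketch, assuming a PyTorch backbone and a soft binary-tree head.
    import torch
    import torch.nn as nn

    class TreeHead(nn.Module):
        """Soft binary tree of a given depth over a feature vector.

        Each internal node holds a linear gate giving P(left branch | features);
        a leaf's probability is the product of branch probabilities on its path.
        """

        def __init__(self, feat_dim: int, depth: int):
            super().__init__()
            self.depth = depth
            self.n_internal = 2 ** depth - 1           # internal nodes, breadth-first order
            self.gates = nn.Linear(feat_dim, self.n_internal)

        def forward(self, h: torch.Tensor) -> torch.Tensor:
            # h: (batch, feat_dim) -> leaf probabilities: (batch, 2**depth)
            p_left = torch.sigmoid(self.gates(h))      # (batch, n_internal)
            probs = torch.ones(h.size(0), 1, device=h.device)
            for level in range(self.depth):
                start = 2 ** level - 1                 # first node index at this level
                width = 2 ** level
                gate = p_left[:, start:start + width]  # gates for all nodes at this level
                left = probs * gate                    # mass routed to left children
                right = probs * (1.0 - gate)           # mass routed to right children
                probs = torch.stack([left, right], dim=2).reshape(h.size(0), 2 * width)
            return probs                               # rows sum to 1

    if __name__ == "__main__":
        backbone = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 128), nn.ReLU())
        head = TreeHead(feat_dim=128, depth=3)         # 8 leaf clusters
        x = torch.randn(4, 3, 32, 32)                  # toy batch of images
        leaf_probs = head(backbone(x))
        print(leaf_probs.shape, leaf_probs.sum(dim=1)) # torch.Size([4, 8]), each row ~1.0

In such a setup, the hierarchy comes for free: pruning the tree at a shallower level yields coarser clusters, and the depth at which two samples' paths diverge can serve as a similarity measure between data points, as described in the abstract.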

Citation (APA)

Znalezniak, M., Rola, P., Kaszuba, P., Tabor, J., & Śmieja, M. (2023). Contrastive Hierarchical Clustering. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 14169 LNAI, pp. 627–643). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-031-43412-9_37
