Hierarchically clustered representation learning

7 Citations · 38 Mendeley Readers

Abstract

The joint optimization of representation learning and clustering in the embedding space has seen breakthroughs in recent years. Despite this progress, clustering with representation learning has been limited to flat categories, which often amounts to cohesive clustering focused on instance-level relations. To overcome the limitations of flat clustering, we introduce hierarchically clustered representation learning (HCRL), which simultaneously optimizes representation learning and hierarchical clustering in the embedding space. In contrast to the few prior works, HCRL is the first attempt to model the generation of deep embeddings from every component of the hierarchy, not just the leaf components. Beyond obtaining hierarchically clustered embeddings, we can reconstruct data at various abstraction levels, infer the intrinsic hierarchical structure, and learn the level-proportion features. We conducted evaluations in the image and text domains, and our quantitative analyses showed competitive likelihoods and the best accuracies compared with the baselines.
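The abstract does not specify the model, but its central idea, that every node of the hierarchy (not only the leaves) generates embeddings, can be illustrated with a small, hypothetical PyTorch sketch. Everything below (the ToyHCRL class, the flat set of seven nodes standing in for a root, internal, and leaf components, one Gaussian per node, and simplified soft responsibilities) is our illustrative assumption, not the authors' implementation; the paper's actual model and inference are more involved.

# Hypothetical sketch (not the authors' code): a VAE-style model whose
# latent prior mixes over EVERY node of a small fixed tree, so internal
# nodes generate embeddings too, not just leaves.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyHCRL(nn.Module):
    def __init__(self, x_dim=784, z_dim=10, n_nodes=7):
        # n_nodes=7 stands in for 1 root + 2 internal + 4 leaf components;
        # the parent-child structure itself is omitted for brevity.
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(x_dim, 256), nn.ReLU())
        self.mu, self.logvar = nn.Linear(256, z_dim), nn.Linear(256, z_dim)
        self.dec = nn.Sequential(nn.Linear(z_dim, 256), nn.ReLU(),
                                 nn.Linear(256, x_dim))
        # One Gaussian component per tree node, root and internal included.
        self.node_mu = nn.Parameter(torch.randn(n_nodes, z_dim))
        self.node_logvar = nn.Parameter(torch.zeros(n_nodes, z_dim))

    def forward(self, x):
        h = self.enc(x)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterize
        # Simplified soft assignment of each embedding to every node
        # (mixture weights and normalizing constants are dropped).
        d = ((z.unsqueeze(1) - self.node_mu) ** 2
             / self.node_logvar.exp()).sum(-1)
        resp = F.softmax(-0.5 * d, dim=1)              # (batch, n_nodes)
        rec_loss = F.mse_loss(self.dec(z), x, reduction='none').sum(-1)
        # KL between q(z|x) and each node's Gaussian, responsibility-weighted,
        # so non-leaf components also shape the embedding space.
        kl = 0.5 * (self.node_logvar - logvar.unsqueeze(1)
                    + (logvar.unsqueeze(1).exp()
                       + (mu.unsqueeze(1) - self.node_mu) ** 2)
                    / self.node_logvar.exp() - 1).sum(-1)
        return (rec_loss + (resp * kl).sum(1)).mean(), resp

# Usage: loss trains encoder, decoder, and all node Gaussians jointly;
# resp gives each sample's soft membership over the whole hierarchy.
model = ToyHCRL()
loss, resp = model(torch.rand(32, 784))
loss.backward()

The design point of the sketch is only that the KL term is averaged over all node components rather than leaf components alone, which is the distinction the abstract draws against prior flat or leaf-only clustering approaches.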

Cite

APA

Shin, S. J., Song, K., & Moon, I. C. (2020). Hierarchically clustered representation learning. In AAAI 2020 - 34th AAAI Conference on Artificial Intelligence (pp. 5776–5783). AAAI Press. https://doi.org/10.1609/aaai.v34i04.6034
