Incorporating Hierarchy into Text Encoder: a Contrastive Learning Approach for Hierarchical Text Classification

108 citations · 113 Mendeley readers

Abstract

Hierarchical text classification is a challenging subtask of multi-label classification due to its complex label hierarchy. Existing methods encode text and label hierarchy separately and mix their representations for classification, so the hierarchy representation remains unchanged for all input text. Instead of modeling them separately, in this work, we propose Hierarchy-guided Contrastive Learning (HGCLR) to directly embed the hierarchy into a text encoder. During training, HGCLR constructs positive samples for input text under the guidance of the label hierarchy. By pulling together the input text and its positive sample, the text encoder learns to generate hierarchy-aware text representations on its own. Therefore, after training, the HGCLR-enhanced text encoder can dispense with the now-redundant hierarchy encoder. Extensive experiments on three benchmark datasets verify the effectiveness of HGCLR.
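The "pulling together" of an input text and its hierarchy-guided positive sample that the abstract describes is the standard contrastive-learning objective; papers in this area typically use an InfoNCE/NT-Xent-style loss with in-batch negatives. The sketch below is illustrative only, not the paper's actual implementation: the function name `info_nce_loss`, the NumPy formulation, and the temperature value are all assumptions for exposition.

```python
import numpy as np

def info_nce_loss(anchors, positives, temperature=0.07):
    """Illustrative InfoNCE-style contrastive loss (not the paper's code).

    anchors:   (batch, dim) text representations.
    positives: (batch, dim) representations of hierarchy-guided positive
               samples; row i is the positive for anchor i, and the other
               rows in the batch serve as negatives.
    """
    # L2-normalize so dot products are cosine similarities.
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    # Pairwise similarity matrix, scaled by the temperature.
    logits = (a @ p.T) / temperature
    # Log-softmax over each row, with max-subtraction for stability.
    logits = logits - logits.max(axis=1, keepdims=True)
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # Minimizing the loss pulls each anchor toward its own positive
    # (the diagonal) and pushes it away from other samples in the batch.
    return -np.mean(np.diag(log_probs))
```

A matched anchor/positive batch should incur a lower loss than a mismatched one, which is the "pulling together" effect the abstract refers to.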

Citation (APA)

Wang, Z., Wang, P., Huang, L., Sun, X., & Wang, H. (2022). Incorporating Hierarchy into Text Encoder: a Contrastive Learning Approach for Hierarchical Text Classification. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (Vol. 1, pp. 7109–7119). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2022.acl-long.491
