Fusing Label Embedding into BERT: An Efficient Improvement for Text Classification

Abstract

As pre-trained models such as BERT gain more and more attention, a great deal of research has been carried out to further improve their capabilities, from refining the experimental procedures (Sun et al., 2019) to improving the underlying mathematical principles. In this paper, we propose a concise method for improving BERT's performance in text classification by utilizing a label embedding technique while keeping almost the same computational cost. Experimental results on six text classification benchmark datasets demonstrate its effectiveness.
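To illustrate the general idea of pairing label embeddings with a BERT encoder, the sketch below is a minimal, illustrative example and not the paper's exact method: it encodes the label names with the same BERT model as the input text and scores a document by comparing its [CLS] representation with each label embedding. The model name, label set, and sample text are assumptions made for the example.

```python
# Illustrative sketch of the label-embedding idea (not the exact method of
# Xiong et al., 2021): encode label names with the same BERT encoder as the
# input text, then score a document against each label embedding.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
encoder = BertModel.from_pretrained("bert-base-uncased")

# Hypothetical label set for a topic-classification task.
label_names = ["sports", "business", "science", "politics"]

def cls_embedding(texts):
    """Return the [CLS] hidden state for a batch of texts."""
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        outputs = encoder(**batch)
    return outputs.last_hidden_state[:, 0]  # [CLS] token representation

# Encode the label names once; these vectors act as the label embeddings.
label_emb = cls_embedding(label_names)              # (num_labels, hidden)

# Score a document by dot product between its [CLS] vector and each label.
doc_emb = cls_embedding(["The team won the championship last night."])
scores = doc_emb @ label_emb.T                      # (1, num_labels)
pred = label_names[scores.argmax(dim=-1).item()]
print(pred)
```

The paper's actual fusion mechanism may differ in how the label embeddings interact with the encoder; the sketch only conveys why the extra computational cost of such an approach can stay small, since the label names are encoded once and reused for every input.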

Citation (APA)

Xiong, Y., Feng, Y., Wu, H., Kamigaito, H., & Okumura, M. (2021). Fusing Label Embedding into BERT: An Efficient Improvement for Text Classification. In Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021 (pp. 1743–1750). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.findings-acl.152
