Efficient Nearest Neighbor Emotion Classification with BERT-whitening

Abstract

Retrieval-based methods have proven effective in many NLP tasks. Previous methods use representations from a pre-trained model directly for similarity search. However, sentence representations from pre-trained models such as BERT perform poorly at retrieving semantically similar sentences, which in turn degrades the performance of retrieval-based methods. In this paper, we propose KNN-EC, a simple and efficient non-parametric emotion classification (EC) method based on nearest neighbor retrieval. We use BERT-whitening to obtain better sentence semantics, ensuring that nearest neighbor retrieval works well. BERT-whitening also reduces the memory footprint of the datastore and accelerates retrieval, addressing the efficiency problems of previous methods. KNN-EC improves over the pre-trained model by an average of 1.17 F1-macro on two emotion classification datasets.
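
To make the retrieval component concrete, the following sketch illustrates BERT-whitening followed by nearest neighbor voting, assuming sentence embeddings have already been extracted from BERT (e.g., mean-pooled hidden states). The function names, the choice of 256 retained dimensions, and the similarity-weighted voting are illustrative assumptions, not the authors' released implementation; it covers only the whitening and retrieval steps described in the abstract.

import numpy as np

def compute_whitening(embeddings, n_components=256):
    # Estimate a whitening transform (mu, W) from training embeddings.
    # After the transform, (x - mu) @ W has approximately identity covariance;
    # keeping only the first n_components columns also reduces dimension,
    # shrinking the datastore and speeding up nearest neighbor search.
    mu = embeddings.mean(axis=0, keepdims=True)
    cov = np.cov((embeddings - mu).T)
    u, s, _ = np.linalg.svd(cov)
    W = u @ np.diag(1.0 / np.sqrt(s))
    return mu, W[:, :n_components]

def whiten(x, mu, W):
    # Project and unit-normalize so inner product equals cosine similarity.
    z = (x - mu) @ W
    return z / np.linalg.norm(z, axis=1, keepdims=True)

def knn_predict(query, keys, labels, k=8, num_classes=7):
    # Non-parametric prediction: similarity-weighted vote over the k nearest
    # training sentences stored in the datastore.
    sims = keys @ query
    topk = np.argsort(-sims)[:k]
    weights = np.maximum(sims[topk], 0.0)
    votes = np.bincount(labels[topk], weights=weights, minlength=num_classes)
    return int(votes.argmax())

# Usage sketch: train_emb is an (N, 768) array of BERT sentence embeddings
# with integer labels y; test_emb holds the query sentences.
# mu, W = compute_whitening(train_emb)
# keys = whiten(train_emb, mu, W)              # the (smaller) datastore
# pred = knn_predict(whiten(test_emb, mu, W)[0], keys, y)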

Citation (APA)

Yin, W., & Shang, L. (2022). Efficient Nearest Neighbor Emotion Classification with BERT-whitening. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, EMNLP 2022 (pp. 4738–4745). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2022.emnlp-main.312
