Domain-invariant feature distillation for cross-domain sentiment classification

17 citations · 130 Mendeley readers

Abstract

Cross-domain sentiment classification has drawn much attention in recent years. Most existing approaches focus on learning domain-invariant representations across the source and target domains, while few pay attention to domain-specific information. Although domain-specific information is not directly transferable, jointly learning domain-dependent representations can facilitate the learning of domain-invariant ones. In this paper, we focus on aspect-level cross-domain sentiment classification and propose to distill domain-invariant sentiment features with the help of an orthogonal domain-dependent task, i.e., aspect detection, which is built on aspects that vary widely across domains. We conduct extensive experiments on three public datasets, and the results demonstrate the effectiveness of our method.
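The abstract describes a shared encoder whose features are split between a domain-invariant sentiment task and a domain-specific aspect-detection task, with the two kept orthogonal. The sketch below is a hypothetical illustration of that idea (the abstract does not give the paper's exact loss); it implements one common choice, a penalty on the inner products between the two feature sets, which is zero when the domain-invariant and domain-specific representations are orthogonal.

```python
# Hypothetical sketch of an orthogonality penalty between domain-invariant
# and domain-specific feature vectors (not the paper's exact formulation).

def dot(u, v):
    """Inner product of two equal-length vectors."""
    return sum(a * b for a, b in zip(u, v))

def orthogonality_loss(invariant_feats, specific_feats):
    """Sum of squared inner products between paired domain-invariant and
    domain-specific feature vectors; zero when every pair is orthogonal."""
    return sum(dot(h_inv, h_spec) ** 2
               for h_inv, h_spec in zip(invariant_feats, specific_feats))

# Toy batch of two examples with 3-dimensional features.
h_inv = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]
h_orth = [[0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]   # orthogonal to h_inv
h_over = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]   # fully overlapping with h_inv

print(orthogonality_loss(h_inv, h_orth))  # 0.0
print(orthogonality_loss(h_inv, h_over))  # 2.0
```

In training, such a penalty would be added to the sentiment and aspect-detection losses so that the aspect-detection head absorbs domain-specific signal, leaving the sentiment features domain-invariant.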

Citation (APA)

Hu, M., Wu, Y., Zhao, S., Guo, H., Cheng, R., & Su, Z. (2019). Domain-invariant feature distillation for cross-domain sentiment classification. In EMNLP-IJCNLP 2019 - 2019 Conference on Empirical Methods in Natural Language Processing and 9th International Joint Conference on Natural Language Processing, Proceedings of the Conference (pp. 5559–5568). Association for Computational Linguistics. https://doi.org/10.18653/v1/d19-1558
