DomBERT: Domain-oriented language model for aspect-based sentiment analysis


Abstract

This paper focuses on learning domain-oriented language models driven by end tasks, aiming to combine the strengths of general-purpose language models (such as ELMo and BERT) with domain-specific language understanding. We propose DomBERT, an extension of BERT that learns from both an in-domain corpus and corpora from relevant domains, making it possible to train domain language models under low-resource conditions. Experiments on an assortment of aspect-based sentiment analysis (ABSA) tasks demonstrate promising results.
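The abstract does not spell out how relevant domain corpora are identified or mixed in, so the following is a minimal sketch rather than the paper's actual procedure: it stands in a simple cosine-similarity heuristic over mean [CLS] embeddings for domain-relevance selection, then continues masked-language-model pretraining on the mixed corpus with HuggingFace Transformers. The corpora, the `corpus_embedding` helper, and the top-k selection are all hypothetical.

```python
# Sketch: pick relevant-domain corpora by embedding similarity, then continue
# MLM pretraining on the target corpus plus the selected domains.
# Corpora and the relevance heuristic are illustrative assumptions only.

import torch
from transformers import (
    BertForMaskedLM,
    BertModel,
    BertTokenizerFast,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
encoder = BertModel.from_pretrained("bert-base-uncased").eval()


def corpus_embedding(texts):
    """Mean [CLS] embedding of a corpus sample; a crude domain representation."""
    with torch.no_grad():
        batch = tokenizer(texts, padding=True, truncation=True,
                          max_length=128, return_tensors="pt")
        cls = encoder(**batch).last_hidden_state[:, 0]  # (N, hidden)
    return cls.mean(dim=0)


# Hypothetical corpora: a low-resource target domain plus candidate domains.
target_texts = [
    "The battery life of this laptop is great.",
    "The keyboard feels mushy but the screen is sharp.",
]
candidate_domains = {
    "tablets": ["The tablet screen is bright and responsive."],
    "shoes": ["These running shoes wore out after a month."],
}

# Score each candidate domain by cosine similarity to the target domain.
target_emb = corpus_embedding(target_texts)
scores = {
    name: torch.cosine_similarity(target_emb, corpus_embedding(texts), dim=0).item()
    for name, texts in candidate_domains.items()
}

# Keep the single most relevant domain and mix it into the training corpus.
relevant = [n for n, _ in sorted(scores.items(), key=lambda kv: -kv[1])[:1]]
train_texts = target_texts + [t for n in relevant for t in candidate_domains[n]]

# Continued masked-language-model pretraining on the mixed corpus.
model = BertForMaskedLM.from_pretrained("bert-base-uncased")
encodings = tokenizer(train_texts, truncation=True, max_length=128)
dataset = [{"input_ids": ids} for ids in encodings["input_ids"]]
collator = DataCollatorForLanguageModeling(tokenizer, mlm_probability=0.15)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="dombert-sketch",
                           num_train_epochs=1,
                           per_device_train_batch_size=8),
    data_collator=collator,
    train_dataset=dataset,
)
trainer.train()
```

A static, pre-computed relevance heuristic like this only illustrates the corpus-mixing idea; it is not a claim about how DomBERT itself determines domain relevance during training.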

Cite (APA)

Xu, H., Liu, B., Shu, L., & Yu, P. S. (2020). DomBERT: Domain-oriented language model for aspect-based sentiment analysis. In Findings of the Association for Computational Linguistics: EMNLP 2020 (pp. 1725–1731). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2020.findings-emnlp.156
