This paper focuses on learning domain-oriented language models driven by end tasks, aiming to combine the strengths of general-purpose language models (such as ELMo and BERT) with domain-specific language understanding. We propose DomBERT, an extension of BERT that learns from both an in-domain corpus and corpora from relevant domains, which helps learn domain language models in low-resource settings. Experiments on an assortment of aspect-based sentiment analysis (ABSA) tasks demonstrate promising results.
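At a high level, the idea lends itself to a simple illustration: take a pretrained BERT and continue masked-language-model (MLM) training on a mixture of in-domain text and text from related domains. The sketch below, written against the Hugging Face transformers API, is a minimal approximation of that idea rather than the authors' method; the corpus sentences, hyperparameters, and the fixed 1:1 mixing of the two corpora are hypothetical, and the paper's mechanism for selecting relevant domains is not reproduced.

```python
# Minimal sketch (not the authors' implementation): continue BERT's
# masked-language-model pretraining on a mix of in-domain and
# related-domain text. All corpus sentences are hypothetical stand-ins.
import torch
from torch.optim import AdamW
from transformers import (
    BertTokenizerFast,
    BertForMaskedLM,
    DataCollatorForLanguageModeling,
)

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")

# Low-resource target-domain corpus (e.g. laptop reviews) ...
in_domain = [
    "The battery life of this laptop is impressive.",
    "The keyboard feels mushy and the trackpad is unreliable.",
]
# ... augmented with sentences drawn from related domains (e.g. phones);
# DomBERT's actual relevant-domain sampling is not shown here.
related = [
    "The phone screen is bright but drains the battery quickly.",
    "This tablet charges slowly with the bundled adapter.",
]

encodings = [
    tokenizer(t, truncation=True, max_length=128) for t in in_domain + related
]
# The collator pads the batch and randomly masks 15% of tokens,
# producing the `labels` tensor used by the MLM loss.
collator = DataCollatorForLanguageModeling(
    tokenizer=tokenizer, mlm=True, mlm_probability=0.15
)

optimizer = AdamW(model.parameters(), lr=5e-5)
model.train()
for step in range(3):  # a real run would iterate over many batches
    batch = collator(encodings)
    loss = model(**batch).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    print(f"step {step}: mlm loss = {loss.item():.4f}")
```

The resulting model can then be fine-tuned on the downstream ABSA task in the usual BERT fashion; the mixed-domain pretraining is what distinguishes this recipe from vanilla continued pretraining on the in-domain corpus alone.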
Xu, H., Liu, B., Shu, L., & Yu, P. S. (2020). DomBERT: Domain-oriented language model for aspect-based sentiment analysis. In Findings of the Association for Computational Linguistics: EMNLP 2020 (pp. 1725–1731). Association for Computational Linguistics. https://doi.org/10.18653/v1/2020.findings-emnlp.156