Sentiment analysis of online reviews is an important task in natural language processing and has received considerable attention in both academia and industry, as review data have become an important source of competitive intelligence. Pretrained language models such as BERT and ERNIE have achieved strong results on natural language processing tasks, but they lack domain-specific knowledge. Knowledge graphs, with their high entity and concept coverage and strong semantic expressiveness, can enrich language representations. We propose a sentiment analysis knowledge graph (SAKG)-BERT model that combines sentiment analysis knowledge with the BERT language representation model. To improve the interpretability of the deep learning algorithm, we construct an SAKG whose triples are injected into input sentences as domain knowledge. Our experiments show promising results on sentence completion and sentiment analysis tasks.
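As a rough illustration of the triple-injection step described in the abstract, the following minimal sketch (not the authors' released code) shows how knowledge-graph triples matched to entities in a review might be serialized and appended to the sentence before it is passed to a BERT-style encoder. The SAKG contents, the entity-matching rule, and the bracketed serialization format are all illustrative assumptions.

```python
from typing import Dict, List, Tuple

# Hypothetical sentiment-analysis knowledge graph (SAKG):
# entity -> list of (relation, tail) triples. Contents are made up for illustration.
SAKG: Dict[str, List[Tuple[str, str]]] = {
    "battery life": [("aspect_of", "electronics"), ("polarity_cue", "positive when long")],
    "overheats": [("polarity_cue", "negative")],
}


def inject_triples(sentence: str, kg: Dict[str, List[Tuple[str, str]]]) -> str:
    """Append serialized triples for entities found in the sentence,
    so the downstream encoder sees the domain knowledge as extra context."""
    injected = sentence
    lowered = sentence.lower()
    for entity, triples in kg.items():
        if entity in lowered:  # naive substring matching, for illustration only
            for relation, tail in triples:
                injected += f" [{entity} {relation} {tail}]"
    return injected


if __name__ == "__main__":
    review = "The battery life is great but the phone overheats."
    print(inject_triples(review, SAKG))
    # Prints the original review followed by the serialized triples
    # for the matched entities "battery life" and "overheats".
```

The augmented string can then be tokenized and encoded like any ordinary input; the paper's actual injection mechanism may differ in how triples are matched and fused.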
Citation: Yan, X., Jian, F., & Sun, B. (2021). SAKG-BERT: Enabling Language Representation with Knowledge Graphs for Chinese Sentiment Analysis. IEEE Access, 9, 101695–101701. https://doi.org/10.1109/ACCESS.2021.3098180