Integrating BERT Embeddings and BiLSTM for Emotion Analysis of Dialogue

  • Gou, Z.
  • Li, Y.
Citations: N/A
Readers (Mendeley): 17

This article is free to access.

Abstract

Dialogue systems are an important application of natural language processing in human-computer interaction. Emotion analysis of dialogue aims to classify the emotion of each utterance in a dialogue, which is crucially important to dialogue systems. In a dialogue system, emotion analysis supports semantic understanding and response generation, and it is of great significance to practical applications such as customer service quality inspection, intelligent customer service systems, and chatbots. However, emotion analysis in dialogue must cope with short texts, synonyms, neologisms, and reversed word order. In this paper, we argue that modeling dialogue utterances along different feature dimensions leads to more accurate emotion analysis. Based on this, we propose a model in which BERT (bidirectional encoder representations from transformers) generates word-level and sentence-level vectors; the word-level vectors are then processed by a BiLSTM (bidirectional long short-term memory) network, which better captures bidirectional semantic dependencies, and the resulting word-level and sentence-level representations are concatenated and fed into a linear layer to determine the emotion of each utterance. Experimental results on two real dialogue datasets show that the proposed method significantly outperforms the baselines.
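
The abstract only sketches the architecture, so the following is a minimal PyTorch sketch of the described BERT + BiLSTM pipeline, assuming the HuggingFace Transformers library. The model name bert-base-uncased, the BiLSTM hidden size, the number of emotion classes, and the use of the final BiLSTM time step as the sequence summary are illustrative assumptions, not details taken from the paper.

```python
# Illustrative sketch, not the authors' released code.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer

class BertBiLSTMEmotionClassifier(nn.Module):
    def __init__(self, bert_name="bert-base-uncased", lstm_hidden=256, num_emotions=7):
        super().__init__()
        self.bert = BertModel.from_pretrained(bert_name)
        bert_dim = self.bert.config.hidden_size            # 768 for bert-base
        # BiLSTM over the word-level (token) vectors produced by BERT.
        self.bilstm = nn.LSTM(bert_dim, lstm_hidden,
                              batch_first=True, bidirectional=True)
        # Linear layer over [BiLSTM summary ; sentence-level vector].
        self.classifier = nn.Linear(2 * lstm_hidden + bert_dim, num_emotions)

    def forward(self, input_ids, attention_mask):
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        word_vecs = out.last_hidden_state      # (B, T, 768) word-level vectors
        sent_vec = out.pooler_output           # (B, 768) sentence-level vector
        lstm_out, _ = self.bilstm(word_vecs)   # (B, T, 2 * lstm_hidden)
        summary = lstm_out[:, -1, :]           # final time step as summary (assumption)
        features = torch.cat([summary, sent_vec], dim=-1)
        return self.classifier(features)       # emotion logits per utterance

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertBiLSTMEmotionClassifier()
batch = tokenizer(["I am so happy to hear that!"], return_tensors="pt",
                  padding=True, truncation=True)
logits = model(batch["input_ids"], batch["attention_mask"])
print(logits.shape)  # torch.Size([1, 7])
```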

Citation (APA)

Gou, Z., & Li, Y. (2023). Integrating BERT Embeddings and BiLSTM for Emotion Analysis of Dialogue. Computational Intelligence and Neuroscience, 2023(1). https://doi.org/10.1155/2023/6618452
