Toward Transformer Fusions for Chinese Sentiment Intensity Prediction in Valence-Arousal Dimensions

Abstract

BERT (Bidirectional Encoder Representations from Transformers) is a transformer-based neural network built from encoder layers with a self-attention mechanism. In this study, we develop a Chinese word-level BERT to learn contextual language representations and propose a transformer fusion framework for Chinese sentiment intensity prediction in the valence-arousal dimensions. Experimental results on the Chinese EmoBank indicate that our transformer-based fusion model outperforms other neural-network-based, regression-based, and lexicon-based methods, reflecting the effectiveness of integrating semantic representations at different degrees of linguistic granularity. Our proposed transformer fusion framework is also simple and easy to fine-tune on different downstream tasks.
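As a rough illustration of the kind of model the abstract describes (a pretrained Chinese BERT encoder with a regression head over valence and arousal), the sketch below uses the Hugging Face transformers library with the public bert-base-chinese checkpoint as a stand-in for the paper's word-level Chinese BERT; it is a minimal assumption-laden example, not the authors' released fusion code.

```python
# Minimal sketch of a BERT-based valence-arousal regressor.
# Assumptions: Hugging Face "transformers" is installed and
# "bert-base-chinese" stands in for the paper's word-level Chinese BERT.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer


class VARegressor(nn.Module):
    def __init__(self, pretrained="bert-base-chinese"):
        super().__init__()
        self.bert = BertModel.from_pretrained(pretrained)
        hidden = self.bert.config.hidden_size
        # Two-unit linear head: one output for valence, one for arousal.
        self.head = nn.Linear(hidden, 2)

    def forward(self, input_ids, attention_mask):
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        # Use the pooled [CLS] representation as the sentence embedding.
        return self.head(out.pooler_output)


tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")
model = VARegressor()
batch = tokenizer(["這部電影非常精彩"], return_tensors="pt",
                  padding=True, truncation=True)
with torch.no_grad():
    va = model(batch["input_ids"], batch["attention_mask"])
print(va)  # shape (1, 2): [valence, arousal] on whatever scale the model is trained for
```

In practice the head would be trained with a regression loss (e.g., mean squared error) against the EmoBank valence-arousal ratings; the paper's fusion framework further combines representations at different linguistic granularities, which this single-encoder sketch omits.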

Citation (APA)

Deng, Y. C., Wang, Y. R., Chen, S. H., & Lee, L. H. (2023). Toward Transformer Fusions for Chinese Sentiment Intensity Prediction in Valence-Arousal Dimensions. IEEE Access, 11, 109974–109982. https://doi.org/10.1109/ACCESS.2023.3322436
