Abstract
BERT (Bidirectional Encoder Representations from Transformers) is a transformer-based neural network built from stacked encoder layers with self-attention. In this study, we develop a Chinese word-level BERT to learn contextual language representations and propose a transformer fusion framework for Chinese sentiment intensity prediction in the valence-arousal dimensions. Experimental results on the Chinese EmoBank indicate that our transformer-based fusion model outperforms other neural-network-based, regression-based, and lexicon-based methods, reflecting the effectiveness of integrating semantic representations at different degrees of linguistic granularity. The proposed transformer fusion framework is also simple and easy to fine-tune for different downstream tasks.
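The abstract does not spell out the fusion architecture, so the sketch below is a minimal, hypothetical illustration (not the authors' implementation) of one way such a transformer fusion could be realized in PyTorch: pooled [CLS] representations from a character-level and a word-level Chinese BERT are concatenated and fed to a two-output regression head that predicts valence and arousal. The checkpoint names, the VAFusionRegressor class, and the head dimensions are assumptions made for illustration only.

```python
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer


class VAFusionRegressor(nn.Module):
    """Hypothetical fusion model: concatenates sentence-level representations
    from two Chinese BERT encoders (different linguistic granularities) and
    regresses a valence-arousal pair."""

    def __init__(self,
                 char_model: str = "bert-base-chinese",   # real public character-level checkpoint
                 word_model: str = "bert-base-chinese"):  # placeholder; the paper's word-level BERT is not public
        super().__init__()
        self.char_encoder = AutoModel.from_pretrained(char_model)
        self.word_encoder = AutoModel.from_pretrained(word_model)
        fused_dim = (self.char_encoder.config.hidden_size
                     + self.word_encoder.config.hidden_size)
        # Simple two-output regression head: [valence, arousal].
        self.regressor = nn.Sequential(
            nn.Linear(fused_dim, 256),
            nn.ReLU(),
            nn.Dropout(0.1),
            nn.Linear(256, 2),
        )

    def forward(self, char_inputs: dict, word_inputs: dict) -> torch.Tensor:
        # Use each encoder's [CLS] token as a sentence summary, then fuse
        # the two granularities by concatenation (late fusion).
        char_cls = self.char_encoder(**char_inputs).last_hidden_state[:, 0]
        word_cls = self.word_encoder(**word_inputs).last_hidden_state[:, 0]
        fused = torch.cat([char_cls, word_cls], dim=-1)
        return self.regressor(fused)


if __name__ == "__main__":
    # Example usage: score one sentence. For simplicity the same
    # character-level tokenization feeds both branches; a real word-level
    # branch would use its own tokenizer.
    tokenizer = AutoTokenizer.from_pretrained("bert-base-chinese")
    model = VAFusionRegressor()
    inputs = tokenizer("這部電影非常精彩", return_tensors="pt")
    with torch.no_grad():
        va = model(inputs, inputs)  # shape (1, 2): predicted [valence, arousal]
    print(va)
```

In a sketch like this, training would typically minimize mean squared error against the EmoBank valence-arousal annotations, with both encoders fine-tuned end to end alongside the regression head.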
Deng, Y. C., Wang, Y. R., Chen, S. H., & Lee, L. H. (2023). Toward Transformer Fusions for Chinese Sentiment Intensity Prediction in Valence-Arousal Dimensions. IEEE Access, 11, 109974–109982. https://doi.org/10.1109/ACCESS.2023.3322436