Abstract
Multi-modal sentiment analysis (MSA) is an increasingly active research area because it extends conventional text-based sentiment analysis (SA) to multi-modal content, which provides richer affective information. However, compared with text-based sentiment analysis, MSA poses far greater challenges, because joint learning over multi-modal data requires both fine-grained semantic matching and effective fusion of heterogeneous features. Existing approaches generally infer sentiment from a concatenation of features extracted from the different modalities, neglecting the strong semantic correlations among co-occurring data across modalities. To address these challenges, a multi-level deep correlative network for multi-modal sentiment analysis is proposed, which reduces the semantic gap by jointly analyzing the mid-level semantic features of images and the hierarchical deep correlations between modalities. First, the most relevant cross-modal feature representations are generated with multi-modal deep and discriminative correlation analysis (Multi-DDCA), while keeping the per-modality feature representations discriminative. Second, the high-level semantic outputs of Multi-DDCA are encoded into an attention-correlation cross-modal feature representation by a co-attention-based multi-modal correlation submodel, and then fused by a multi-layer neural network to train a sentiment classifier that predicts sentiment categories. Extensive experimental results on five datasets demonstrate the effectiveness of the proposed approach, which outperforms several state-of-the-art fusion strategies for sentiment analysis.
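The abstract outlines a three-stage pipeline: correlation-constrained modality encoders (Multi-DDCA), a co-attention correlation submodel, and a multi-layer fusion classifier. The following is a minimal PyTorch sketch of that general pattern only; all module names, dimensions, and the simplified per-dimension correlation loss are illustrative assumptions, not the paper's actual Multi-DDCA formulation.

```python
# Hedged sketch: correlated text/image encoders -> co-attention -> MLP classifier,
# in the spirit of the abstract. Names and shapes are assumptions for illustration.
import torch
import torch.nn as nn


class CoAttentionFusion(nn.Module):
    def __init__(self, text_dim=300, img_dim=512, shared_dim=128, n_classes=3):
        super().__init__()
        # Modality-specific encoders projecting into a shared space, where a
        # correlation objective (a DCCA-style loss) can be applied.
        self.text_enc = nn.Sequential(nn.Linear(text_dim, shared_dim), nn.ReLU())
        self.img_enc = nn.Sequential(nn.Linear(img_dim, shared_dim), nn.ReLU())
        # Bilinear affinity matrix for co-attention between the two modalities.
        self.affinity = nn.Parameter(torch.randn(shared_dim, shared_dim) * 0.01)
        self.classifier = nn.Sequential(
            nn.Linear(2 * shared_dim, shared_dim), nn.ReLU(),
            nn.Linear(shared_dim, n_classes),
        )

    def forward(self, text_feats, img_feats):
        # text_feats: (B, T, text_dim) token features
        # img_feats:  (B, R, img_dim) image-region features
        t = self.text_enc(text_feats)  # (B, T, d)
        v = self.img_enc(img_feats)    # (B, R, d)
        # Affinity C[b, i, j] = t_i^T W v_j, the standard co-attention recipe.
        C = torch.einsum('btd,de,bre->btr', t, self.affinity, v)  # (B, T, R)
        # Attend image regions with text, and text tokens with image regions.
        att_v = torch.softmax(C, dim=2) @ v                       # (B, T, d)
        att_t = torch.softmax(C, dim=1).transpose(1, 2) @ t       # (B, R, d)
        # Pool the attended sequences and fuse them for classification.
        fused = torch.cat([att_v.mean(dim=1), att_t.mean(dim=1)], dim=-1)
        return self.classifier(fused), t.mean(dim=1), v.mean(dim=1)


def correlation_loss(t_pooled, v_pooled, eps=1e-6):
    # Negative mean per-dimension Pearson correlation between pooled text and
    # image features: a simplified surrogate standing in for Multi-DDCA.
    t = t_pooled - t_pooled.mean(dim=0)
    v = v_pooled - v_pooled.mean(dim=0)
    corr = (t * v).sum(dim=0) / (t.norm(dim=0) * v.norm(dim=0) + eps)
    return -corr.mean()


# Usage: logits feed a cross-entropy term; the correlation term is added to
# the objective with a weighting hyper-parameter (here 0.1, an assumption).
model = CoAttentionFusion()
logits, t_pool, v_pool = model(torch.randn(4, 20, 300), torch.randn(4, 36, 512))
loss = nn.functional.cross_entropy(logits, torch.tensor([0, 1, 2, 1])) \
    + 0.1 * correlation_loss(t_pool, v_pool)
```

The bilinear affinity and the two softmax directions follow the common co-attention pattern; the correlation term encourages the shared space that the co-attention operates on to be cross-modally aligned, which is the role the abstract assigns to Multi-DDCA.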
Citation
Cai, G., Lyu, G., Lin, Y., & Wen, Y. (2020). Multi-level deep correlative networks for multi-modal sentiment analysis. Chinese Journal of Electronics, 29(6), 1025–1038. https://doi.org/10.1049/cje.2020.09.003