Inconsistency Matters: A Knowledge-guided Dual-inconsistency Network for Multi-modal Rumor Detection

19 citations · 41 readers on Mendeley

Abstract

Rumor spreaders increasingly use multimedia content to attract the attention and trust of news consumers. Although a number of rumor detection models have exploited multimodal data, they seldom consider the inconsistent relationships between images and text. Moreover, they lack an effective way to spot inconsistencies between post content and background knowledge. Motivated by the intuition that rumors are more likely to contain semantic inconsistencies, a novel Knowledge-guided Dual-inconsistency Network is proposed to detect rumors with multimedia content. It captures inconsistent semantics at both the cross-modal level and the content-knowledge level in one unified framework. Extensive experiments on two public real-world datasets demonstrate that our proposal outperforms state-of-the-art baselines.

Citation (APA)

Sun, M., Zhang, X., Ma, J., & Liu, Y. (2021). Inconsistency Matters: A Knowledge-guided Dual-inconsistency Network for Multi-modal Rumor Detection. In Findings of the Association for Computational Linguistics: EMNLP 2021 (pp. 1412–1423). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.findings-emnlp.122
