Design of Human-Agent-Group Interaction for Correct Opinion Sharing on Social Media


Abstract

Social media makes it easy to share news, but that very simplicity also makes it easy to receive and believe fake news. This paper proposes a new model that extends the opinion sharing model (OSM) to simulate a cyber-physical system in which humans and agents cooperate to prevent the spread of fake news, and it builds a decision-support system on that model to help users avoid fake news. Experiments evaluate the proposed system in a social network simulation based on Twitter. The results show that: (1) the proposed system reproduces the behavior of the original OSM; (2) users should pass the information they receive to the agents directly in order to promote the sharing of correct information; and (3) in the simulation based on Twitter posts, the proposed system enabled the agents to suggest the opposing opinion to users who had shared fake news.

CITATION STYLE

APA

Uwano, F., Yamane, D., & Takadama, K. (2022). Design of Human-Agent-Group Interaction for Correct Opinion Sharing on Social Media. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 13305 LNCS, pp. 146–165). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-031-06424-1_12
