Reputation is a central element of social communication, whether among humans or with artificial intelligence (AI), and as such can be the primary target of malicious communication strategies. There is already a vast literature on trust networks and their dynamics, built on Bayesian principles and involving Theory of Mind models. An issue for these simulations can be the amount of information that can be stored and managed using discretized variables and hard thresholds. Here, a novel approach to the way information is updated is proposed that accounts for knowledge uncertainty and is closer to reality. Agents use information compression techniques to capture their complex environment and store it in their finite memories. The resulting loss of information leads to emergent phenomena such as echo chambers, self-deception, deception symbiosis, and the freezing of group opinions. Various malicious agent strategies, such as sycophancy, egocentricity, pathological lying, and aggressiveness, are studied for their impact on group sociology. Our set-up already provides insights into social interactions and can be used to investigate the effects of various communication strategies and to find ways of counteracting malicious ones. Eventually, this work should help to safeguard the design of non-abusive AI systems.
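The abstract's core mechanism, Bayesian belief updates compressed into a finite parametric memory, can be illustrated with a minimal sketch. The sketch below assumes, purely for illustration, that an agent's belief about another agent's honesty is stored as a two-parameter beta distribution and is compressed by moment matching after each Bayesian update; the paper's actual representation and compression scheme may differ, and all function names and parameters here are placeholders.

```python
import numpy as np

# Illustrative sketch, not the paper's exact update rule: an agent's belief
# about another agent's honesty x in [0, 1] is kept in finite memory as a
# Beta(a, b) distribution. Incoming evidence is folded in via Bayes' rule on
# a grid, and the exact posterior is then compressed back to two parameters
# by moment matching, so some information is lost at every update.

X = np.linspace(1e-3, 1 - 1e-3, 1000)           # grid of honesty values

def compress_to_beta(posterior):
    """Moment-match a gridded, unnormalized posterior to Beta(a, b) parameters."""
    posterior = posterior / posterior.sum()
    m1 = (X * posterior).sum()                   # posterior mean
    var = (X**2 * posterior).sum() - m1**2       # posterior variance
    common = m1 * (1.0 - m1) / var - 1.0         # standard beta moment inversion
    return m1 * common, (1.0 - m1) * common      # (a, b)

def update_belief(a, b, likelihood):
    """One Bayesian update followed by lossy compression back to beta form."""
    prior = X**(a - 1.0) * (1.0 - X)**(b - 1.0)
    return compress_to_beta(prior * likelihood(X))

# Example: starting from a flat prior, the agent hears a message it judges
# mildly supportive of the sender's honesty. Repeating such lossy updates is
# the kind of mechanism that can let finite memories drift toward frozen or
# self-confirming group opinions.
a, b = 1.0, 1.0                                  # uninformative prior on honesty
a, b = update_belief(a, b, lambda x: 0.8 * x + 0.2 * (1.0 - x))
print(f"compressed belief: Beta({a:.2f}, {b:.2f}), mean honesty = {a / (a + b):.2f}")
```

The design choice worth noting is that the compression step, not the Bayesian update itself, is what discards information; in a simulation, repeated compression across many exchanged messages is where emergent effects of the kind listed in the abstract can arise.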
Enßlin, T., Kainz, V., & Bœhm, C. (2022). A Reputation Game Simulation: Emergent Social Phenomena from Information Theory. Annalen der Physik, 534(5). https://doi.org/10.1002/andp.202100277