We commonly rely on the experience of others when making decisions. Reputation mechanisms formally aggregate feedback collected from peers and compute the reputation of products, services, or providers. The success of a reputation mechanism, however, depends on obtaining truthful feedback. Side-payments (i.e., agents are paid for submitting feedback) can make honest reporting rational (i.e., a Nash equilibrium). Unfortunately, known schemes also have other Nash equilibria that imply lying. In this paper we analyze the equilibria of two incentive-compatible reputation mechanisms and investigate how the undesired equilibrium points can be eliminated by using trusted reports. © Springer-Verlag Berlin Heidelberg 2005.
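The coexistence of honest and lying equilibria described in the abstract can be illustrated with a toy side-payment scheme (this is an illustrative sketch, not the mechanisms analyzed in the paper): under an "output agreement" rule, an agent is paid 1 whenever its report matches a peer's report. The prior `P_GOOD`, signal accuracy `ACCURACY`, and the two pure strategies below are all assumed parameters chosen for the example.

```python
# Illustrative sketch (assumed parameters, not the paper's exact mechanism):
# an "output agreement" side-payment pays an agent 1 iff its report matches
# a randomly selected peer's report.

P_GOOD = 0.5    # assumed prior probability that the product is good
ACCURACY = 0.9  # assumed probability an agent's signal matches the truth

def signal_dist(state):
    """Return {signal: probability} for an agent observing true state."""
    other = "B" if state == "G" else "G"
    return {state: ACCURACY, other: 1 - ACCURACY}

STRATEGIES = {
    "honest":      lambda sig: sig,  # report the observed signal
    "always_good": lambda sig: "G",  # lie: always praise the product
}

def expected_payment(strat_i, strat_j):
    """Expected side-payment to agent i when i uses strat_i and the
    reference peer uses strat_j (payment 1 iff the reports match)."""
    total = 0.0
    for state, p_state in (("G", P_GOOD), ("B", 1 - P_GOOD)):
        for sig_i, p_i in signal_dist(state).items():
            for sig_j, p_j in signal_dist(state).items():
                match = STRATEGIES[strat_i](sig_i) == STRATEGIES[strat_j](sig_j)
                total += p_state * p_i * p_j * (1.0 if match else 0.0)
    return total

def is_symmetric_equilibrium(strat):
    """A symmetric profile is a Nash equilibrium iff no unilateral
    deviation to another pure strategy raises the deviator's payment."""
    base = expected_payment(strat, strat)
    return all(expected_payment(dev, strat) <= base + 1e-12
               for dev in STRATEGIES)

for s in STRATEGIES:
    print(s, round(expected_payment(s, s), 3), is_symmetric_equilibrium(s))
```

Running the sketch shows that both "everyone honest" and "everyone reports good" are Nash equilibria, and the lying equilibrium actually pays more, which is exactly the kind of undesired equilibrium point the paper seeks to eliminate.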
CITATION STYLE
Jurca, R., & Faltings, B. (2005). Enforcing truthful strategies in incentive compatible reputation mechanisms. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 3828 LNCS, pp. 268–277). https://doi.org/10.1007/11600930_26