Across two studies, we test two of Facebook’s attempts to fight misinformation: labeling misinformation as disputed or false and including fact checks as related articles. We propose hypotheses based on a two-step model of motivated reasoning, which provides insight into how misinformation is corrected. For study 1 (n = 1,262) and study 2 (n = 1,586), we created a mock Facebook News Feed consisting of five articles: four actual news stories and one piece of misinformation. Both studies tested (a) the effect of misinformation without correction, (b) Facebook’s changes to its platform, and (c) an alternative we theorized could be more effective. Consistent with the two-step model of motivated reasoning, the findings provide evidence of asymmetric partisan effects on belief in misinformation: in both studies, partisans differed in their responses to fact checking. We find modest evidence that our improvements to Facebook’s approach to correcting misinformation reduce misperceptions across partisan divides.
Jennings, J., & Stroud, N. J. (2021). Asymmetric adjustment: Partisanship and correcting misinformation on Facebook. New Media & Society. https://doi.org/10.1177/14614448211021720