Asymmetric adjustment: Partisanship and correcting misinformation on Facebook

Citations of this article
Mendeley users who have this article in their library.
Get full text


Across two studies, we test two of Facebook’s attempts to fight misinformation: labeling misinformation as disputed or false and including fact checks as related articles. We propose hypotheses based on a two-step model of motivated reasoning, which provides insight into how misinformation is corrected. For study 1 (n = 1,262) and study 2 (n = 1,586), we created a mock Facebook News Feed consisting of five different articles—four were actual news stories and the fifth was misinformation. Both studies tested (a) the effect of misinformation without correction, (b) Facebook’s changes to its platform, and (c) an alternative we theorized could be more effective. The findings, in line with the two-step model of motivated reasoning, provide evidence of symmetric party effects for the belief in misinformation. In both studies, we find partisan differences in responses to fact checking. We find modest evidence that our improvements to Facebook’s attempts at correcting misinformation reduce misperceptions across partisan divides.




Jennings, J., & Stroud, N. J. (2021). Asymmetric adjustment: Partisanship and correcting misinformation on Facebook. New Media and Society.

Register to see more suggestions

Mendeley helps you to discover research relevant for your work.

Already have an account?

Save time finding and organizing research with Mendeley

Sign up for free