The tension between the increasing need for fact-checking and the limited capacity of fact-check providers has inspired several crowdsourced approaches to address this challenge. However, little is known about how effectively crowdsourced fact-checking performs in the real world at a large scale. We fill this gap by evaluating a Taiwanese crowdsourced fact-checking community and two professional fact-checking sites along four dimensions: Variety, Velocity, Veracity, and Viability. Our analysis shows that the two types of sites differ in topic coverage (variety) and demonstrates that while crowdsourced fact-checkers respond to new requests much faster than professionals (velocity), they often build on existing professional knowledge when handling repeated requests. In addition, our findings indicate that the accuracy of the crowdsourced community (veracity) parallels that of the professional sources, and that crowdsourced fact-checks are perceived as close to professional ones in terms of objectivity, clarity, and persuasiveness (viability).
Citation
Zhao, A., & Naaman, M. (2023). Insights from a Comparative Study on the Variety, Velocity, Veracity, and Viability of Crowdsourced and Professional Fact-Checking Services. Journal of Online Trust and Safety, 2(1). https://doi.org/10.54501/jots.v2i1.118