Literature reviews allow scientists to stand on the shoulders of giants, highlighting promising directions, summarizing progress, and pointing out open challenges in research. At the same time, conducting a systematic literature review is a laborious and consequently expensive process. Over the last decade, several studies have examined crowdsourcing in literature reviews. This paper explores the feasibility of crowdsourcing for facilitating the literature review process in terms of results, time, and effort, and identifies which crowdsourcing strategies provide the best results for a given budget. In particular, we focus on the screening phase of the literature review process, where we contribute and assess strategies for running crowdsourcing tasks that are efficient in terms of budget and classification error. Finally, we present our findings from experiments run on Crowdflower.
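The abstract does not spell out the strategies themselves, so the sketch below is purely illustrative of the budget/error tradeoff in crowdsourced screening, not the paper's actual method; all names and parameters (worker_accuracy, cost_per_vote, etc.) are hypothetical assumptions. It simulates majority voting with k crowd votes per paper: cost grows linearly in k, while the chance of a wrong include/exclude decision shrinks as votes are added.

    import random

    def simulate_screening(n_papers=1000, worker_accuracy=0.8,
                           votes_per_paper=3, cost_per_vote=0.02, seed=42):
        """Hypothetical sketch: majority-vote screening of candidate papers.

        Each paper has a true include/exclude label; each of the
        votes_per_paper simulated workers votes correctly with probability
        worker_accuracy. Returns (classification error rate, total cost).
        """
        rng = random.Random(seed)
        errors = 0
        for _ in range(n_papers):
            truth = rng.random() < 0.5  # true label: include (True) / exclude
            include_votes = 0
            for _ in range(votes_per_paper):
                correct = rng.random() < worker_accuracy
                vote = truth if correct else not truth
                include_votes += vote  # bools count as 0/1
            decision = include_votes * 2 > votes_per_paper  # majority rule
            if decision != truth:
                errors += 1
        total_cost = n_papers * votes_per_paper * cost_per_vote
        return errors / n_papers, total_cost

    # Trading budget for accuracy: more votes per paper cost more but err less.
    for k in (1, 3, 5):
        err, cost = simulate_screening(votes_per_paper=k)
        print(f"{k} votes/paper: error rate {err:.3f}, budget ${cost:.2f}")

Even this naive baseline shows why strategy matters: the number of votes per paper drives the budget directly, so smarter vote-allocation strategies such as those the paper assesses can buy lower classification error at the same cost.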
CITATION STYLE
Krivosheev, E., Casati, F., Caforio, V., & Benatallah, B. (2017). Crowdsourcing Paper Screening in Systematic Literature Reviews. In Proceedings of the 5th AAAI Conference on Human Computation and Crowdsourcing, HCOMP 2017 (pp. 108–117). AAAI Press. https://doi.org/10.1609/hcomp.v5i1.13302