Crowdsourcing with unsure option

Abstract

One of the fundamental issues in crowdsourcing is the trade-off between the number of workers needed for high-accuracy aggregation and the budget to pay them. To save cost, it is important to ensure high quality of the crowdsourced labels, so that fewer labels are needed for reliable aggregation. Since workers' confidence is often closely related to their abilities, a possible way to control quality is to request that workers return labels only when they feel confident, by providing them with an 'unsure' option. On the other hand, allowing workers to choose the unsure option can waste part of the budget. In this work, we analyze when providing the unsure option indeed leads to significant cost reduction, and how the confidence threshold might be set. We also propose an online mechanism as an alternative for threshold selection when estimating the ability distribution of the crowd is difficult.
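To make the setting concrete, the following is a minimal simulation sketch (not the paper's actual mechanism or analysis): each simulated worker has an assumed ability, returns a label only when that ability exceeds a hypothetical confidence threshold, and otherwise chooses the unsure option; the collected labels are then aggregated by majority vote. All names, the ability distribution, and the threshold value are illustrative assumptions.

```python
import random

def worker_response(ability, threshold, true_label, rng):
    """Return the worker's label, or None for the 'unsure' option.

    Assumption for illustration: a worker's confidence equals its ability,
    and it answers correctly with probability equal to that ability.
    """
    if ability < threshold:  # not confident enough -> choose the unsure option
        return None
    return true_label if rng.random() < ability else 1 - true_label

def aggregate(responses, rng):
    """Majority vote over the non-unsure responses (ties broken at random)."""
    votes = [r for r in responses if r is not None]
    if not votes:
        return None
    ones = sum(votes)
    if ones * 2 == len(votes):
        return rng.choice([0, 1])
    return 1 if ones * 2 > len(votes) else 0

if __name__ == "__main__":
    rng = random.Random(0)
    true_label = 1
    abilities = [rng.uniform(0.5, 1.0) for _ in range(11)]  # assumed ability distribution
    threshold = 0.7                                          # hypothetical confidence threshold
    responses = [worker_response(a, threshold, true_label, rng) for a in abilities]
    print("responses:", responses)          # None marks an 'unsure' answer
    print("aggregated label:", aggregate(responses, rng))
```

Under this toy model, raising the threshold filters out low-ability answers but also increases the number of unsure responses that still consume budget, which is the trade-off the paper studies.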

Citation (APA)

Ding, Y. X., & Zhou, Z. H. (2018). Crowdsourcing with unsure option. Machine Learning, 107(4), 749–766. https://doi.org/10.1007/s10994-017-5677-x
