Evaluating the crowd with confidence

65 citations · 105 Mendeley readers

Abstract

Worker quality control is a crucial aspect of crowdsourcing systems, typically occupying a large fraction of the time and money invested in crowdsourcing. In this work, we devise techniques to generate confidence intervals for worker error rate estimates, thereby enabling a better evaluation of worker quality. We show that our techniques generate correct confidence intervals on a range of real-world datasets, and we demonstrate their wide applicability by using them to evict poorly performing workers and to provide confidence intervals on the accuracy of the answers.
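The abstract does not specify how the confidence intervals are constructed; as a point of reference, a standard baseline for an interval on a worker's error rate (estimated from answers to gold-standard questions) is the Wilson score interval for a binomial proportion. The sketch below is illustrative only, not the paper's method; the function name and the example counts are assumptions.

```python
import math

def wilson_interval(errors, n, z=1.96):
    """Wilson score confidence interval for a worker's error rate,
    given `errors` mistakes out of `n` gold-standard questions.
    z=1.96 corresponds to a 95% confidence level.
    This is a generic baseline, not the method from the paper."""
    if n == 0:
        return (0.0, 1.0)  # no data: the error rate is unconstrained
    p = errors / n
    denom = 1 + z ** 2 / n
    center = (p + z ** 2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2))
    return (max(0.0, center - half), min(1.0, center + half))

# Hypothetical worker: 8 wrong answers on 50 gold questions.
lo, hi = wilson_interval(8, 50)
```

A downstream use matching the abstract's eviction application would be to evict a worker only when the interval's lower bound exceeds an error-rate threshold, so that eviction decisions account for estimation uncertainty rather than the point estimate alone.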

Citation (APA)

Joglekar, M., Garcia-Molina, H., & Parameswaran, A. (2013). Evaluating the crowd with confidence. In Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (pp. 686–694). Association for Computing Machinery. https://doi.org/10.1145/2487575.2487595
