HiveMind: Tuning Crowd Response with a Single Value

Abstract

One common problem plaguing crowdsourcing tasks is tuning the set of worker responses: depending on task requirements, requesters may want a large set of rich and varied worker responses (typically in subjective evaluation tasks) or a more convergent response set (typically in more objective tasks such as fact-checking). This problem is especially salient in tasks that combine workers’ responses into a single output: divergence in these settings could add either richness and complexity or noise to the unified answer. In this paper we present HiveMind, a system of methods that allows requesters to tune different levels of convergence in worker participation for different tasks simply by adjusting the value of one variable.
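The abstract does not describe HiveMind's actual mechanism. As a purely illustrative sketch, one way a single tuning value could trade off convergence against diversity is to blend an agreement score and a novelty score when ranking worker responses; the parameter `alpha`, the `select_responses` helper, and the scoring formula below are hypothetical and are not taken from the paper.

```python
from collections import Counter

def select_responses(responses, alpha, k=3):
    """Toy illustration of tuning crowd convergence with one value.

    alpha near 1.0 favors responses that agree with the consensus
    (convergent, e.g. fact-checking); alpha near 0.0 favors rarer,
    more varied responses (divergent, e.g. subjective evaluation).
    Hypothetical helper, not the HiveMind implementation.
    """
    counts = Counter(responses)
    _, top_count = counts.most_common(1)[0]

    def score(r):
        agreement = counts[r] / top_count            # 1.0 for the consensus answer
        novelty = 1.0 - counts[r] / len(responses)   # higher for rarer answers
        return alpha * agreement + (1 - alpha) * novelty

    # Rank unique responses by the blended score and keep the top k.
    return sorted(set(responses), key=score, reverse=True)[:k]

if __name__ == "__main__":
    answers = ["blue", "blue", "blue", "azure", "teal", "blue-green"]
    print(select_responses(answers, alpha=1.0))  # convergent: consensus answer ranks first
    print(select_responses(answers, alpha=0.0))  # divergent: varied answers surface
```

In this sketch the single variable smoothly interpolates between the two regimes the abstract describes, which is the general idea of a one-parameter tuning knob, though the paper's own methods may work quite differently.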

Citation (APA)

Singh, P., Lasecki, W. S., Barelli, P., & Bigham, J. P. (2013). HiveMind: Tuning Crowd Response with a Single Value. In Proceedings of the 1st AAAI Conference on Human Computation and Crowdsourcing, HCOMP 2013 (pp. 66–67). AAAI Press. https://doi.org/10.1609/hcomp.v1i1.13130
