Monetary Interventions in Crowdsourcing Task Switching

Abstract

With a large number of tasks of various types, requesters on crowdsourcing platforms often bundle tasks of different types into a single working session. This creates a task switching setting, in which workers must shift between different cognitive tasks. We design and conduct an experiment on Amazon Mechanical Turk to study how occasionally presented performance-contingent monetary rewards, referred to as monetary interventions, affect worker performance in the task switching setting. We use two competing metrics to evaluate worker performance. When monetary interventions are placed on some tasks in a working session, our results show that worker performance on these tasks improves in both metrics. Moreover, worker performance on the other tasks, where monetary interventions are not placed, is also affected: workers perform better according to one metric but worse according to the other. This suggests that in addition to providing extrinsic monetary incentives for some tasks, monetary interventions implicitly set performance goals for all tasks. Furthermore, monetary interventions are most effective in improving worker performance when used at switch tasks, tasks that follow a task of a different type, in working sessions with a low task switching frequency.
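
For readers unfamiliar with the task-switching terminology, the following minimal Python sketch illustrates how one might label switch tasks and compute a session's task switching frequency. The helper names and the particular frequency definition (fraction of non-initial tasks that are switch tasks) are illustrative assumptions, not taken from the paper.

```python
from typing import List


def switch_positions(task_types: List[str]) -> List[int]:
    """Indices of switch tasks: tasks that follow a task of a different type."""
    return [i for i in range(1, len(task_types))
            if task_types[i] != task_types[i - 1]]


def switching_frequency(task_types: List[str]) -> float:
    """Fraction of non-initial tasks in the session that are switch tasks."""
    if len(task_types) < 2:
        return 0.0
    return len(switch_positions(task_types)) / (len(task_types) - 1)


# Example: a low-switching-frequency session alternating blocks of two task types
session = ["A", "A", "A", "B", "B", "B", "A", "A", "A", "B", "B", "B"]
print(switch_positions(session))     # [3, 6, 9]
print(switching_frequency(session))  # 3 / 11 ~= 0.27
```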

Cite

CITATION STYLE

APA

Yin, M., Chen, Y., & Sun, Y. A. (2014). Monetary Interventions in Crowdsourcing Task Switching. In Proceedings of the 2nd AAAI Conference on Human Computation and Crowdsourcing, HCOMP 2014 (pp. 234–241). AAAI Press. https://doi.org/10.1609/hcomp.v2i1.13160
