CrowdUtility: A Recommendation System for Crowdsourcing Platforms


Abstract

Crowd workers exhibit varying work patterns, expertise, and quality, leading to wide variability in the performance of crowdsourcing platforms. The onus of choosing a suitable platform to post tasks mostly rests with the requester, often leading to poor guarantees and unmet requirements due to the dynamism in platform performance. Towards this end, we demonstrate CrowdUtility, a statistical-modelling-based tool for evaluating multiple crowdsourcing platforms and recommending the platform that best suits the requester's requirements. CrowdUtility uses an online Multi-Armed Bandit framework to schedule tasks while optimizing platform performance. We demonstrate an end-to-end system spanning requirements specification, platform recommendation, and real-time monitoring.
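The abstract names only an "online Multi-Armed Bandit framework" without specifying the policy. As a minimal sketch of how such a framework could treat each crowdsourcing platform as a bandit arm, here is a standard UCB1 policy over platforms; the class name, the reward definition (observed task quality in [0, 1]), and the choice of UCB1 itself are illustrative assumptions, not details from the paper.

```python
import math
import random

class PlatformBandit:
    """UCB1 sketch: each arm is a crowdsourcing platform; the reward
    for a posted task is its observed quality in [0, 1]. Illustrative
    only -- the paper does not specify which bandit policy it uses."""

    def __init__(self, n_platforms):
        self.counts = [0] * n_platforms    # tasks posted per platform
        self.values = [0.0] * n_platforms  # running mean reward

    def select_platform(self):
        # Post one task to each platform first to initialize estimates.
        for i, c in enumerate(self.counts):
            if c == 0:
                return i
        total = sum(self.counts)
        # Choose the platform maximizing mean reward + exploration bonus.
        ucb = [v + math.sqrt(2 * math.log(total) / c)
               for v, c in zip(self.values, self.counts)]
        return ucb.index(max(ucb))

    def update(self, platform, reward):
        # Incrementally update the running mean for the chosen platform.
        self.counts[platform] += 1
        n = self.counts[platform]
        self.values[platform] += (reward - self.values[platform]) / n

# Usage: simulate three platforms with different (hidden) quality rates.
random.seed(0)
bandit = PlatformBandit(3)
quality = [0.3, 0.5, 0.8]  # hypothetical per-platform success rates
for _ in range(2000):
    p = bandit.select_platform()
    bandit.update(p, 1.0 if random.random() < quality[p] else 0.0)
```

Over many rounds the exploration bonus shrinks and most tasks flow to the best-performing platform, which matches the abstract's goal of scheduling tasks while optimizing platform performance.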

Citation (APA)

Chander, D., Bhattacharya, S., Celis, E., Dasgupta, K., Karanam, S., Rajan, V., & Gupta, A. (2014). CrowdUtility: A Recommendation System for Crowdsourcing Platforms. In Proceedings of the 2nd AAAI Conference on Human Computation and Crowdsourcing, HCOMP 2014 (pp. 69–70). AAAI Press. https://doi.org/10.1609/hcomp.v2i1.13138
