The ACTIVECROWDTOOLKIT: An Open-Source Tool for Benchmarking Active Learning Algorithms for Crowdsourcing Research

Abstract

We present an open-source toolkit that enables easy comparison of the performance of active learning methods across a series of datasets. The toolkit allows such strategies to be constructed by combining a judgement aggregation model, a task selection method, and a worker selection method. It also provides a user interface that allows researchers to gain insight into worker performance and task classification at runtime.
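To make the compositional structure described above concrete, the sketch below shows one way an active-learning strategy could be assembled from the three pluggable components named in the abstract: a judgement aggregation model, a task selector, and a worker selector. This is a minimal illustrative Python sketch, not the toolkit's actual API; all class and function names here (MajorityVoteAggregator, RandomTaskSelector, run_strategy, etc.) are hypothetical stand-ins for the baseline methods the toolkit benchmarks.

```python
# Hypothetical sketch of an active-learning strategy composed from three
# pluggable components: an aggregation model, a task selector, and a worker
# selector. Names are illustrative only, not the toolkit's API.
import random
from collections import Counter, defaultdict


class MajorityVoteAggregator:
    """Aggregates crowd judgements by majority vote (a simple baseline model)."""

    def __init__(self):
        self.judgements = defaultdict(list)  # task_id -> list of labels

    def add_judgement(self, task_id, worker_id, label):
        self.judgements[task_id].append(label)

    def predict(self, task_id):
        votes = self.judgements[task_id]
        return Counter(votes).most_common(1)[0][0] if votes else None


class RandomTaskSelector:
    """Picks the next task to label uniformly at random (a baseline selector)."""

    def select(self, tasks, aggregator):
        return random.choice(tasks)


class RandomWorkerSelector:
    """Picks the worker to ask uniformly at random (a baseline selector)."""

    def select(self, workers, task_id, aggregator):
        return random.choice(workers)


def run_strategy(tasks, workers, get_label, aggregator, task_selector,
                 worker_selector, budget):
    """Runs one strategy for a fixed labelling budget and returns the
    aggregated prediction for every task."""
    for _ in range(budget):
        task = task_selector.select(tasks, aggregator)
        worker = worker_selector.select(workers, task, aggregator)
        aggregator.add_judgement(task, worker, get_label(task, worker))
    return {t: aggregator.predict(t) for t in tasks}


if __name__ == "__main__":
    # Example: simulate a tiny crowdsourcing run with noisy workers.
    truth = {"t1": "cat", "t2": "dog", "t3": "cat"}

    def noisy_label(task, worker):
        return truth[task] if random.random() < 0.8 else random.choice(["cat", "dog"])

    predictions = run_strategy(list(truth), ["w1", "w2", "w3"], noisy_label,
                               MajorityVoteAggregator(), RandomTaskSelector(),
                               RandomWorkerSelector(), budget=20)
    print(predictions)
```

Because each component sits behind a small interface, swapping in a different aggregation model or a more informed task/worker selector changes the strategy without touching the benchmarking loop, which mirrors the comparison workflow the abstract describes.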

Citation (APA)

Venanzi, M., Parson, O., Rogers, A., & Jennings, N. (2015). The ACTIVECROWDTOOLKIT: An Open-Source Tool for Benchmarking Active Learning Algorithms for Crowdsourcing Research. In Proceedings of the 3rd AAAI Conference on Human Computation and Crowdsourcing, HCOMP 2015 (pp. 44–45). AAAI Press. https://doi.org/10.1609/hcomp.v3i1.13256
