DREC: Towards a datasheet for reporting experiments in crowdsourcing

Abstract

Factors such as instructions, payment schemes, and platform demographics, along with strategies for mapping studies into crowdsourcing environments, play an important role in the reproducibility of results. However, inferring these details from scientific articles is often challenging, calling for the development of proper reporting guidelines. This paper takes the first steps towards this goal by describing an initial taxonomy of relevant attributes for crowdsourcing experiments and providing a glimpse into the state of reporting through an analysis of a sample of CSCW papers.

Citation (APA)
Ramírez, J., Baez, M., Casati, F., Cernuzzi, L., & Benatallah, B. (2020). DREC: Towards a datasheet for reporting experiments in crowdsourcing. In Proceedings of the ACM Conference on Computer Supported Cooperative Work, CSCW (pp. 377–382). Association for Computing Machinery. https://doi.org/10.1145/3406865.3418318
