Crowdsourcing Qualitative Thematic Analysis


This article is free to access.

Abstract

Evaluations that include stakeholders aim to understand their perspectives and to ensure that their views are represented. This article offers a new approach to gaining stakeholder perspectives through crowdsourcing. We recruited a sample of individuals with chronic low back pain through a crowdsourcing site. This sample coded textual data describing pain, identified themes, and provided feedback on constructs and procedures. The results generated by the crowdsourced participants were compared with results generated by experts. We found that crowdsourcing the coding of textual responses was feasible, rapid, and inexpensive, offering the potential to enhance patient stakeholder engagement in evaluation. Crowdsourcing has broad implications for evaluation science beyond the health sector.

Citation (APA)

Hilton, L. G., & Azzam, T. (2019). Crowdsourcing Qualitative Thematic Analysis. American Journal of Evaluation, 40(4), 575–589. https://doi.org/10.1177/1098214019836674
