Evaluations that include stakeholders aim to understand their perspectives and to ensure that their views are represented. This article offers a new approach to gaining stakeholder perspectives through crowdsourcing. We recruited a sample of individuals with chronic low back pain through a crowdsourcing site. These participants coded textual data describing pain, generated themes, and gave feedback on constructs and procedures. The results generated by the crowdsourced participants were compared with results generated by experts. We found that crowdsourcing the coding of textual responses was feasible, rapid, and inexpensive, offering the potential to enhance patient stakeholder engagement in evaluation. Crowdsourcing has broad implications for evaluation science beyond the health sector.
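The abstract does not say how crowdsourced and expert coding were compared; a common way to quantify agreement between two coders is Cohen's kappa. The Python sketch below is a hypothetical illustration only: the variable names, code categories, and labels are invented for the example and do not come from the paper.

# Hypothetical illustration: agreement between crowdsourced and expert
# coders measured with Cohen's kappa. Categories and labels are made up;
# the paper does not specify its comparison method.
from sklearn.metrics import cohen_kappa_score

# Each element is the thematic code one coder assigned to one text excerpt.
crowd_codes  = ["activity_limits", "emotional_impact", "pain_quality",
                "coping", "pain_quality", "activity_limits"]
expert_codes = ["activity_limits", "emotional_impact", "pain_quality",
                "emotional_impact", "pain_quality", "activity_limits"]

kappa = cohen_kappa_score(crowd_codes, expert_codes)
print(f"Cohen's kappa (crowd vs. expert): {kappa:.2f}")

Kappa corrects raw percent agreement for agreement expected by chance, which matters when a few codes dominate the data.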
CITATION STYLE
Hilton, L. G., & Azzam, T. (2019). Crowdsourcing Qualitative Thematic Analysis. American Journal of Evaluation, 40(4), 575–589. https://doi.org/10.1177/1098214019836674