Abstract
Background: No systematic evaluation of smartphone/mobile apps for resuscitation training and real incident support has been available to date. To provide medical, usability, and additional quality criteria for the development of such apps, we conducted a mixed-methods sequential evaluation combining the perspectives of medical experts and end-users.

Objective: The study aims to assess the quality of current mobile apps for cardiopulmonary resuscitation (CPR) training and real incident support from both expert and end-user perspectives.

Methods: Two independent medical experts evaluated the medical content of CPR apps from the Google Play store and the Apple App store. The evaluation was based on predefined minimum medical content requirements according to current Basic Life Support (BLS) guidelines. In a second phase, non-medical end-users tested the usability and appeal of the apps that had at least met the minimum requirements. Usability was assessed with the System Usability Scale (SUS); appeal was measured with the self-developed ReactionDeck toolkit.

Results: Of 61 apps, 46 were included in the expert evaluation; 13 apps met the minimum requirements and formed the consolidated list for the subsequent layperson evaluation. The interrater reliability between experts was substantial (kappa=.61). Layperson end-users (n=14) had a high interrater reliability (intraclass correlation 1 [ICC1]=.83, P
Kalz, M., Lenssen, N., Felzen, M., Rossaint, R., Tabuenca, B., Specht, M., & Skorning, M. (2014). Smartphone apps for cardiopulmonary resuscitation training and real incident support: A mixed-methods evaluation study. Journal of Medical Internet Research, 16(3). https://doi.org/10.2196/jmir.2951