Crowdsourced Feedback to Improve Resident Physician Error Disclosure Skills: A Randomized Clinical Trial

Abstract

IMPORTANCE Residents must prepare for effective communication with patients after medical errors. The video-based communication assessment (VCA) is software that plays video of a patient scenario, asks the physician to record what they would say, engages crowdsourced laypeople to rate audio recordings of physician responses, and presents feedback to physicians.

OBJECTIVE To evaluate the effectiveness of VCA feedback in resident error disclosure skill training.

DESIGN, SETTING, AND PARTICIPANTS This single-blinded, randomized clinical trial was conducted from July 2022 to May 2023 at 7 US internal medicine and family medicine residencies (10 total sites). Participants were second-year residents attending required teaching conferences. Data analysis was performed from July to December 2023.

INTERVENTION Residents completed 2 VCA cases at time 1 and were randomized to the intervention, an individual feedback report provided in the VCA application after 2 weeks, or to control, in which feedback was not provided until after time 2. Residents completed 2 additional VCA cases after 4 weeks (time 2).

MAIN OUTCOMES AND MEASURES Panels of crowdsourced laypeople rated recordings of residents disclosing simulated medical errors to create scores on a 5-point scale. Reports included learning points derived from layperson comments. Mean time 2 ratings were compared to test the hypothesis that residents who had access to feedback on their time 1 performance would score higher at time 2 than those without feedback access. Residents were surveyed about demographic characteristics, disclosure experience, and feedback use. The intervention's effect was examined using analysis of covariance.

RESULTS A total of 146 residents (87 [60.0%] aged 25-29 years; 60 female [41.0%]) completed the time 1 VCA, and 103 (70.5%) completed the time 2 VCA (53 randomized to intervention and 50 randomized to control); of those, 28 (54.9%) reported reviewing their feedback. Analysis of covariance found a significant main effect of feedback between intervention and control groups at time 2 (mean [SD] score, 3.26 [0.45] vs 3.14 [0.39]; difference, 0.12; 95% CI, 0.08-0.48; P = .01). In post hoc comparisons restricted to residents without prior disclosure experience, intervention residents scored higher than those in the control group at time 2 (mean [SD] score, 3.33 [0.43] vs 3.09 [0.44]; difference, 0.24; 95% CI, 0.01-0.48; P = .007). Worse performance at time 1 was associated with increased likelihood of dropping out before time 2 (odds ratio, 2.89; 95% CI, 1.06-7.84; P = .04).

CONCLUSIONS AND RELEVANCE In this randomized clinical trial, self-directed review of crowdsourced feedback was associated with higher ratings of internal medicine and family medicine residents' error disclosure skill, particularly for those without real-life error disclosure experience, suggesting that such feedback may be an effective way for residency programs to meet their requirement to prepare trainees for communicating with patients after medical harm.

Citation (APA)
White, A. A., King, A. M., D'Addario, A. E., Brigham, K. B., Bradley, J. M., Gallagher, T. H., & Mazor, K. M. (2024). Crowdsourced Feedback to Improve Resident Physician Error Disclosure Skills: A Randomized Clinical Trial. JAMA Network Open, 7(8), e2425923. https://doi.org/10.1001/jamanetworkopen.2024.25923
