Background: Rule-based clinical decision support alerts are known to malfunction, but tools for discovering malfunctions are limited.

Objective: To investigate whether user override comments can be used to discover malfunctions.

Methods: We manually classified all rules in our database with at least 10 override comments into 3 categories based on a sample of override comments: broken; not broken, but could be improved; and not broken. We used 3 methods (frequency of comments, a cranky word list heuristic, and a naïve Bayes classifier trained on a sample of comments) to automatically rank rules based on features of their override comments. We evaluated each ranking using the manual classification as ground truth.

Results: Of the rules investigated, 62 were broken, 13 could be improved, and the remaining 45 were not broken. Frequency of comments performed worse than a random ranking, with a precision at 20 of 8 and an AUC of 0.487. The cranky word list heuristic performed better, with a precision at 20 of 16 and an AUC of 0.723. The naïve Bayes classifier had a precision at 20 of 17 and an AUC of 0.738.

Discussion: Override comments uncovered malfunctions in 26% of all rules active in our system. This is a lower bound on total malfunctions and much higher than expected. Even for low-resource organizations, reviewing comments identified by the cranky word list heuristic may be an effective and feasible way of finding broken alerts.

Conclusion: Override comments are a rich data source for finding alerts that are broken or could be improved. If possible, we recommend monitoring all override comments on a regular basis.
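The abstract does not include the authors' implementation; the following is a minimal, hypothetical Python sketch of the two better-performing ranking methods, assuming scikit-learn and toy data. The cranky word list, rule identifiers, training labels, and comment text are illustrative assumptions, not the paper's actual resources.

from collections import defaultdict

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

# Override comments as (rule_id, comment) pairs -- toy data for illustration.
comments = [
    ("rule_17", "This alert fires on every patient, it is useless"),
    ("rule_17", "Wrong drug entirely, please fix this rule"),
    ("rule_42", "Patient is already on therapy, alert is appropriate"),
]

by_rule = defaultdict(list)
for rule_id, text in comments:
    by_rule[rule_id].append(text)

# Method 1: cranky word list heuristic. Rank rules by the share of their
# override comments containing a "cranky" word. This word list is an
# illustrative guess, not the authors' curated list.
CRANKY_WORDS = {"wrong", "useless", "broken", "annoying", "fix", "error"}

def cranky_score(rule_comments):
    hits = sum(any(w in c.lower() for w in CRANKY_WORDS) for c in rule_comments)
    return hits / len(rule_comments)

heuristic_ranking = sorted(
    by_rule, key=lambda r: cranky_score(by_rule[r]), reverse=True
)

# Method 2: naive Bayes classifier. Train on a manually labeled sample of
# comments (1 = comment suggests a malfunction), then rank each rule by the
# mean predicted malfunction probability across its comments.
train_texts = ["wrong drug entirely", "alert is appropriate", "fires on everyone"]
train_labels = [1, 0, 1]

vectorizer = CountVectorizer()
clf = MultinomialNB()
clf.fit(vectorizer.fit_transform(train_texts), train_labels)

def nb_score(rule_comments):
    probs = clf.predict_proba(vectorizer.transform(rule_comments))
    return probs[:, 1].mean()  # column 1 = probability of the "malfunction" class

nb_ranking = sorted(by_rule, key=lambda r: nb_score(by_rule[r]), reverse=True)

# Precision at 20 in the sense used by the abstract: how many of the top 20
# ranked rules are truly broken, given the manual classification as truth.
def precision_at_k(ranking, broken_rules, k=20):
    return sum(rule in broken_rules for rule in ranking[:k])

Ranking rules rather than classifying individual comments is what makes a metric like precision at 20 meaningful: a reviewer with limited time inspects only the top of the list, so the methods are compared on how many genuinely broken rules they surface there.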
CITATION STYLE
Aaron, S., McEvoy, D. S., Ray, S., Hickman, T. T. T., & Wright, A. (2019). Cranky comments: Detecting clinical decision support malfunctions through free-text override reasons. Journal of the American Medical Informatics Association, 26(1), 37–43. https://doi.org/10.1093/jamia/ocy139