How Can Transformer Models Shape Future Healthcare: A Qualitative Study

Abstract

Transformer models have been successfully applied to various natural language processing and machine translation tasks in recent years, e.g. automatic language understanding. With the advent of more efficient and reliable models (e.g. GPT-3), there is growing potential for automating time-consuming tasks, which could be of particular benefit in healthcare by improving clinical outcomes. This paper aims to summarize potential use cases of transformer models for future healthcare applications. Specifically, we conducted a survey asking experts about their ideas and reflections on future use cases. We received 28 responses, which we analyzed using an adapted thematic analysis. Overall, eight use case categories were identified: documentation and clinical coding, workflow and healthcare services, decision support, knowledge management, interaction support, patient education, health management, and public health monitoring. Future research should consider developing and testing the application of transformer models for these use cases.

Citation (APA)
Denecke, K., May, R., & Rivera Romero, O. (2023). How Can Transformer Models Shape Future Healthcare: A Qualitative Study. Studies in Health Technology and Informatics, 309, 43–47. https://doi.org/10.3233/SHTI230736
