DATLMedQA: A data augmentation and transfer learning based solution for medical question answering

Abstract

The outbreak of COVID-19 has prompted an increased focus on self-care, and more and more people hope to obtain disease knowledge from the Internet. In response to this demand, medical question answering and question generation have become important tasks in natural language processing (NLP). However, samples of medical questions and answers are limited, and existing question generation systems cannot fully meet non-professionals' needs for medical information. In this research, we propose a medically pretrained BERT model that uses GPT-2 for question augmentation and T5-Small for topic extraction, calculates the cosine similarity of the extracted topics, and uses XGBoost for prediction. With GPT-2 augmentation, the prediction accuracy of our model exceeds state-of-the-art (SOTA) performance. Our experimental results demonstrate the outstanding performance of our model on medical question answering and question generation tasks, and its great potential for solving other biomedical question answering challenges.
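To make the pipeline described in the abstract concrete, the sketch below walks through its four stages (GPT-2 question augmentation, T5-Small topic extraction, cosine-similarity matching, XGBoost prediction) using the Hugging Face transformers, sentence-transformers, and xgboost libraries. The model names, toy data, and feature layout are illustrative assumptions, not the authors' released code; in particular, a general-purpose sentence encoder stands in for the paper's medically pretrained BERT.

```python
# Illustrative sketch of the DATLMedQA pipeline stages; model choices and
# the toy data below are assumptions, not the authors' exact configuration.
from transformers import pipeline
from sentence_transformers import SentenceTransformer, util
import numpy as np
import xgboost as xgb

# Stage 1: augment the limited pool of medical questions with GPT-2.
generator = pipeline("text-generation", model="gpt2")
seed = "What are the early symptoms of diabetes?"
augmented = [out["generated_text"]
             for out in generator(seed, max_length=40,
                                  num_return_sequences=3, do_sample=True)]

# Stage 2: extract a short topic from each question with T5-Small,
# here approximated by T5's summarization head.
topic_extractor = pipeline("summarization", model="t5-small")
topics = [topic_extractor(q, max_length=8, min_length=2)[0]["summary_text"]
          for q in augmented]

# Stage 3: embed topics and candidate answers, then score each pair by
# cosine similarity (a generic BERT-based sentence encoder stands in for
# the paper's medically pretrained BERT).
encoder = SentenceTransformer("all-MiniLM-L6-v2")
answers = ["Increased thirst and frequent urination are common early signs.",
           "Aspirin reduces fever and relieves mild pain."]
sims = util.cos_sim(encoder.encode(topics), encoder.encode(answers))

# Stage 4: feed the similarity scores (plus any other features) into an
# XGBoost classifier that predicts whether an answer matches the question.
features = sims.numpy().reshape(-1, 1)      # one row per (topic, answer) pair
labels = np.array([1, 0] * len(topics))     # toy relevance labels for illustration
clf = xgb.XGBClassifier(n_estimators=50).fit(features, labels)
print(clf.predict(features))
```

In a realistic setting, the classifier would be trained on the augmented question-answer corpus rather than toy labels, and the cosine-similarity feature would be combined with other signals before prediction.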

Citation (APA)

Zhou, S., & Zhang, Y. (2021). DATLMedQA: A data augmentation and transfer learning based solution for medical question answering. Applied Sciences (Switzerland), 11(23). https://doi.org/10.3390/app112311251
