Dialogue generation is the automatic generation of a text response given a user's input. Dialogue generation for low-resource languages has been a challenging task for researchers. However, advances in deep learning models have made it possible to develop conversational agents that perform dialogue generation effectively, with helpful applications spanning a variety of domains. Nevertheless, work on conversational bots for low-resource languages such as Arabic is still limited due to various challenges, including the language's structure, vocabulary, and the scarcity of its data resources. Meta-learning has been introduced before in natural language processing (NLP) and has shown significant improvements in many tasks; however, it has rarely been used in natural language generation (NLG) tasks and never in Arabic NLG. In this work, we propose a meta-learning approach for Arabic dialogue generation that enables fast adaptation to low-resource domains in Arabic. We start from existing pre-trained models; we then meta-learn the initial parameters on high-resource datasets before fine-tuning them on the target tasks. We show that the proposed model, which employs meta-learning techniques, improves generalization and enables fast adaptation of the transformer model to low-resource NLG tasks. We report gains in BLEU-4 and improvements in Semantic Textual Similarity (STS) metrics compared to the existing state-of-the-art approach. We also conduct a further study on the effectiveness of the meta-learning algorithms on the models' response generation.
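The abstract describes meta-learning an initialization for a pre-trained transformer on high-resource tasks before fine-tuning it on Arabic dialogue data. The sketch below illustrates one common way such an outer loop can be implemented, a Reptile-style update in PyTorch; the model interface (a Hugging Face-style model returning a .loss), the task structure, and all hyperparameters are assumptions for illustration and are not the paper's actual MetaDial configuration.

```python
# Minimal sketch of a Reptile-style meta-training loop for a pre-trained seq2seq
# transformer. Illustrative only: the real MetaDial setup, datasets, and
# hyperparameters may differ.
import copy
import random
import torch

def inner_adapt(model, support_batches, lr_inner=1e-4, steps=3):
    """Clone the model and take a few gradient steps on one task's support set."""
    adapted = copy.deepcopy(model)
    opt = torch.optim.SGD(adapted.parameters(), lr=lr_inner)
    for _ in range(steps):
        for batch in support_batches:
            loss = adapted(**batch).loss  # assumes a HF-style model returning .loss
            opt.zero_grad()
            loss.backward()
            opt.step()
    return adapted

def meta_train(model, tasks, meta_iters=1000, lr_meta=0.1, tasks_per_iter=4):
    """Outer loop: move the initial parameters toward the task-adapted parameters."""
    for _ in range(meta_iters):
        sampled = random.sample(tasks, tasks_per_iter)
        deltas = [torch.zeros_like(p) for p in model.parameters()]
        for task in sampled:
            adapted = inner_adapt(model, task["support"])  # hypothetical task dict
            for d, p, q in zip(deltas, model.parameters(), adapted.parameters()):
                d += (q.detach() - p.detach()) / tasks_per_iter
        with torch.no_grad():
            for p, d in zip(model.parameters(), deltas):
                p += lr_meta * d  # interpolate toward the averaged adapted weights
    return model  # meta-learned initialization, ready for fine-tuning on the target domain
```

After meta-training, the returned initialization would be fine-tuned on the low-resource target dialogue data in the usual way; only the starting point differs from standard pre-train-then-fine-tune pipelines.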
Citation:
Shamas, M., El Hajj, W., Hajj, H., & Shaban, K. (2023). Metadial: A Meta-learning Approach for Arabic Dialogue Generation. ACM Transactions on Asian and Low-Resource Language Information Processing, 22(6). https://doi.org/10.1145/3590960