Abstract
Recent task-oriented dialog systems have achieved great success in building personal assistants for high-resource languages such as English, but extending these systems to a global audience is challenging due to the need for annotated data or machine translation systems in the target language. An alternative approach is to leverage existing data in a high-resource language to enable cross-lingual transfer in low-resource language models. However, this type of transfer has not been widely explored in natural language response generation. In this research, we investigate the use of state-of-the-art multilingual models such as mBART and T5 to facilitate zero-shot and few-shot transfer of code-switched responses. We propose a new adapter-based framework that allows for efficient transfer by jointly learning task-specific, source-language, and target-language representations. Our framework successfully transfers language knowledge even when the target-language corpus is limited. We present both quantitative and qualitative analyses to evaluate the effectiveness and limitations of our approach.
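The abstract does not specify the exact adapter configuration, so the following is only a minimal PyTorch sketch of the general idea it describes: standard bottleneck adapters (Houlsby et al., 2019) stacked so that task-specific, source-language, and target-language representations are learned by separate lightweight modules while the pretrained backbone stays frozen. The names BottleneckAdapter and StackedAdapters are illustrative, not taken from the paper.

```python
import torch
import torch.nn as nn

class BottleneckAdapter(nn.Module):
    """Standard bottleneck adapter: down-project, non-linearity,
    up-project, plus a residual connection back to the input."""
    def __init__(self, hidden_size: int, bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_size, bottleneck)
        self.up = nn.Linear(bottleneck, hidden_size)
        self.act = nn.ReLU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.up(self.act(self.down(x)))

class StackedAdapters(nn.Module):
    """Hypothetical stacking of a task adapter with source- and
    target-language adapters, applied to the hidden states of a
    frozen multilingual encoder-decoder (e.g., mBART)."""
    def __init__(self, hidden_size: int):
        super().__init__()
        self.task = BottleneckAdapter(hidden_size)
        self.src_lang = BottleneckAdapter(hidden_size)
        self.tgt_lang = BottleneckAdapter(hidden_size)

    def forward(self, hidden: torch.Tensor) -> torch.Tensor:
        hidden = self.task(hidden)
        hidden = self.src_lang(hidden)
        hidden = self.tgt_lang(hidden)
        return hidden

if __name__ == "__main__":
    # (batch, seq_len, hidden) hidden states from a frozen backbone
    h = torch.randn(2, 10, 768)
    adapters = StackedAdapters(hidden_size=768)
    print(adapters(h).shape)  # torch.Size([2, 10, 768])
```

In this kind of setup, only the adapter parameters are updated during training; for zero-shot transfer, a source-language adapter can in principle be swapped for a target-language one at inference time.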
Citation
Wu, T. W., Zhao, C., Chang, E., Shi, Y., Chuang, P., Chandra, V., & Juang, B. (2023). Towards Zero-Shot Multilingual Transfer for Code-Switched Responses. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (Vol. 1, pp. 7551–7563). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2023.acl-long.417