YANMTT: Yet another neural machine translation toolkit


Abstract

In this paper, we present our open-source neural machine translation (NMT) toolkit called "Yet Another Neural Machine Translation Toolkit", abbreviated as YANMTT, which is built on top of the HuggingFace Transformers library. YANMTT aims to enable pre-training and fine-tuning of sequence-to-sequence models with ease. It can be used for training parameter-heavy models with minimal parameter sharing, as well as efficient, lightweight models via heavy parameter sharing. Additionally, efficient fine-tuning can be done via fine-grained tuning parameter selection, adapter tuning and prompt tuning. Our toolkit also comes with a user interface that can be used to demonstrate these models and visualize the attention and embedding representations. Apart from these core features, our toolkit also provides other advanced functionalities such as, but not limited to, document/multi-source NMT, simultaneous NMT, mixtures of experts and model compression.
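To make the abstract's notions of "heavy parameter sharing" and "fine-grained tuning parameter selection" concrete, here is a minimal PyTorch sketch. This is an illustration only, not YANMTT's actual API; the module, layer sizes, and the bias-only tuning rule are invented for the example.

```python
import torch.nn as nn

# A toy two-layer "encoder" whose layers share one weight matrix and bias,
# mimicking heavy parameter sharing across Transformer layers.
class SharedEncoder(nn.Module):
    def __init__(self, dim=16):
        super().__init__()
        self.layer1 = nn.Linear(dim, dim)
        self.layer2 = nn.Linear(dim, dim)
        # Tie the second layer's parameters to the first's: one set of
        # weights is reused, halving the layer parameter count.
        self.layer2.weight = self.layer1.weight
        self.layer2.bias = self.layer1.bias

    def forward(self, x):
        return self.layer2(self.layer1(x))

model = SharedEncoder()

# Fine-grained tuning parameter selection: freeze everything, then
# unfreeze only a chosen subset (here, the bias terms) for fine-tuning.
for p in model.parameters():
    p.requires_grad = False
for name, p in model.named_parameters():
    if name.endswith("bias"):
        p.requires_grad = True

# Shared parameters are counted once by named_parameters()/parameters().
trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
total = sum(p.numel() for p in model.parameters())
```

With `dim=16`, sharing leaves only one 16x16 weight and one 16-dim bias in total, and the selection rule marks just the 16 bias entries as trainable.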

Citation (APA)

Dabre, R., Kanojia, D., Sawant, C., & Sumita, E. (2023). YANMTT: Yet another neural machine translation toolkit. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (Vol. 3, pp. 257–263). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2023.acl-demo.24
