KYB General Machine Translation Systems for WMT23


Abstract

This paper describes our neural machine translation systems for the WMT 2023 general machine translation shared task. Our models use the base configuration of the Transformer architecture, and we improve performance through several strategies: fine-tuning the pretrained model on an extended dataset, and applying specialized pre- and post-processing techniques to raise translation quality. Our central focus is efficient model training, aiming for high accuracy by combining a compact model with curated data. For both translation directions, English to Japanese and Japanese to English, we also apply ensembling augmented by N-best ranking.
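As a rough illustration of the final step, ensemble outputs can be combined by pooling each model's N-best list and scoring shared hypotheses jointly. The sketch below is a minimal, hypothetical version of such N-best reranking; the abstract does not specify the exact ranking criterion, so the voting-plus-average-score rule here is an assumption.

```python
from collections import defaultdict

def rerank_nbest(nbest_lists):
    """Pick a final translation from the N-best lists of an ensemble.

    nbest_lists: one list per model, each a list of (hypothesis, log_prob)
    pairs. Hypotheses proposed by more ensemble members are preferred;
    ties are broken by average log-probability. This ranking rule is
    illustrative only, not the paper's exact method.
    """
    scores = defaultdict(float)  # summed log-probs per hypothesis
    votes = defaultdict(int)     # how many models proposed the hypothesis
    for model_list in nbest_lists:
        for hyp, logp in model_list:
            scores[hyp] += logp
            votes[hyp] += 1
    # More votes first, then higher average score among those models.
    return max(scores, key=lambda h: (votes[h], scores[h] / votes[h]))
```

For example, a hypothesis appearing in the N-best lists of two models outranks one proposed by a single model, even if the latter has a higher individual score.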

Citation (APA)

Kalkar, S., Li, B., & Matsuzaki, Y. (2023). KYB General Machine Translation Systems for WMT23. In Conference on Machine Translation - Proceedings (pp. 137–142). Association for Computational Linguistics. https://doi.org/10.18653/v1/2023.wmt-1.10
