Transformers for Low-resource Neural Machine Translation


Abstract

Recent advances have made neural machine translation the state of the art. However, while it has improved significantly for a few high-resource languages, its performance remains low for less-resourced languages, as the amount of training data strongly affects the quality of machine translation models. Identifying a neural machine translation architecture that trains the best models under low-data conditions is therefore essential for less-resourced languages. This research modified Transformer-based neural machine translation architectures for low-resource polysynthetic languages. In experiments on public benchmark datasets, the proposed system outperformed a strong baseline in automatic evaluation.

Citation (APA)

Gezmu, A. M., & Nürnberger, A. (2022). Transformers for Low-resource Neural Machine Translation. In International Conference on Agents and Artificial Intelligence (Vol. 1, pp. 459–466). Science and Technology Publications, Lda. https://doi.org/10.5220/0010971500003116
