Hyper-parameter optimization in neural-based translation systems: A case study

Abstract

Machine translation (MT) is an important use case of natural language processing (NLP) in which text in a source language is automatically converted into a target language. Modern intelligent systems, or artificial intelligence (AI), rely on machine learning, in which the machine acquires its ability from datasets. In the MT domain, neural machine translation (NMT) systems have now largely replaced statistical machine translation (SMT) systems. NMT systems are implemented with deep learning frameworks, and achieving higher accuracy during training requires extensive hyper-parameter tuning. This paper highlights the significance of hyper-parameter tuning in various machine learning algorithms. As a case study, in-house experiments were conducted on the low-resource English-Bangla language pair by designing an NMT system, and the effect of various hyper-parameter optimizations was analyzed while evaluating performance with the automatic metric BLEU. The BLEU scores obtained for the first, second, and third randomly picked test sentences were 4.1, 3.2, and 3.01, respectively.
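As a rough illustration of the workflow the abstract describes, the sketch below runs a small grid search over a few common NMT hyper-parameters and scores each configuration with BLEU. It is a minimal sketch, not the authors' actual setup: the train_and_translate() helper, the hyper-parameter grid, and the toy English-Bangla sentences are assumptions; only the sacrebleu scoring call reflects a real library API.

```python
# Minimal sketch: grid search over NMT hyper-parameters, scored with BLEU.
# train_and_translate() is a hypothetical stand-in; in practice it would wrap
# an NMT toolkit (e.g. OpenNMT-py or fairseq) trained on the English-Bangla data.
import itertools
import sacrebleu


def train_and_translate(test_sources, learning_rate, dropout, hidden_size):
    """Hypothetical helper: train an NMT model with the given hyper-parameters
    and return translations of test_sources."""
    # Replace with real training + inference; here we simply echo the source
    # sentences so the script runs end to end.
    return list(test_sources)


# Toy held-out data; the paper's experiments use an English-Bangla test set.
test_sources = ["the weather is nice today", "she reads a book"]
references = ["aaj abohawa sundor", "se ekti boi pore"]

# Example hyper-parameter grid (assumed values, for illustration only).
grid = {
    "learning_rate": [1e-3, 5e-4],
    "dropout": [0.1, 0.3],
    "hidden_size": [256, 512],
}

best = None
for values in itertools.product(*grid.values()):
    params = dict(zip(grid.keys(), values))
    hypotheses = train_and_translate(test_sources, **params)
    # corpus_bleu takes the hypotheses and a list of reference streams.
    bleu = sacrebleu.corpus_bleu(hypotheses, [references]).score
    if best is None or bleu > best[0]:
        best = (bleu, params)

print(f"Best BLEU {best[0]:.2f} with {best[1]}")
```

In a real run, each grid point would retrain (or fine-tune) the NMT model, so the search is expensive; that cost is exactly why the paper studies which hyper-parameters matter most for a low-resource pair.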

Citation (APA)

Datta, G., Joshi, N., & Gupta, K. (2023). Hyper-parameter optimization in neural-based translation systems: A case study. International Journal on Smart Sensing and Intelligent Systems, 16(1). https://doi.org/10.2478/ijssis-2023-0010
