LSTM-Based Attentional Embedding for English Machine Translation


This article is free to access.

Abstract

To reduce the workload of manual grading and improve grading efficiency, a computerized intelligent grading system for English translation based on natural language processing is designed, and an attention-embedded LSTM English machine translation model is proposed. First, noting that the standard LSTM network represents words with fixed-dimensional vectors in the encoding stage, an English machine translation model based on LSTM attention embedding is established, and the structural hierarchy of the English translation scoring system is constructed. A language model for the scoring system is then built and used to statistically estimate the probability distribution of a given sentence or word sequence in the translated text. The results show that, compared with English machine translation models built on existing neural network structures such as standard LSTM, RNN, and GRU-Attention models, the proposed LSTM attention-embedding model enhances the representation of source-language contextual information and improves both the performance of the translation model and the quality of the translation.
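The core idea the abstract describes, weighting the encoder's LSTM hidden states by their relevance to the current decoding step rather than compressing the source into one fixed-dimensional vector, can be sketched as dot-product attention. This is a minimal illustration, not the paper's exact architecture: the random arrays stand in for LSTM encoder states and a decoder state, and the function name `attention_context` is an assumption for this example.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attention_context(encoder_states, decoder_state):
    """Dot-product attention: score each encoder hidden state against the
    current decoder state, normalize the scores into a distribution, and
    return the weighted sum (context vector) plus the weights."""
    scores = encoder_states @ decoder_state   # (T,) one score per source position
    weights = softmax(scores)                 # (T,) attention distribution
    context = weights @ encoder_states        # (d,) context vector for this step
    return context, weights

# Toy example: 3 source positions, hidden size 4 (stand-ins for LSTM states).
rng = np.random.default_rng(0)
H = rng.normal(size=(3, 4))   # encoder hidden states, one row per source token
s = rng.normal(size=4)        # current decoder hidden state
ctx, w = attention_context(H, s)
```

The context vector `ctx` changes at every decoding step as the weights shift across source positions, which is what lets the decoder attend to different parts of the source sentence instead of a single fixed encoding.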

Citation (APA)

Jian, L., Xiang, H., & Le, G. (2022). LSTM-Based Attentional Embedding for English Machine Translation. Scientific Programming, 2022. https://doi.org/10.1155/2022/3909726
