Paragraph-level hierarchical neural machine translation


Abstract

Neural Machine Translation (NMT) has made great progress in recent years, but two challenges remain: building a high-quality corpus and finding optimal model parameters for long-text translation. In this paper, we first construct a paragraph-parallel corpus from English and Chinese versions of novels, and then design a hierarchical model to address these two challenges. Our encoder and decoder take all the sentences of a paragraph as input and process words, sentences, and paragraphs at different levels, using a two-layer transformer. The bottom transformer of the encoder and decoder serves as an additional level of abstraction, conditioning on its own previous hidden states. Experimental results show that our hierarchical model significantly outperforms seven competitive baselines, including ensembles.
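The word→sentence→paragraph hierarchy described in the abstract can be illustrated with a toy sketch: contextualize the words within each sentence (bottom level), pool each sentence to a single vector, then contextualize those sentence vectors across the paragraph (top level). All names and the single-head attention below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x):
    # Toy single-head self-attention with queries = keys = values = x.
    scores = x @ x.T / np.sqrt(x.shape[-1])
    return softmax(scores) @ x

def encode_paragraph(sentences):
    # Bottom level: contextualize the words inside each sentence,
    # then mean-pool each sentence to one vector.
    sent_vecs = np.stack([self_attention(s).mean(axis=0) for s in sentences])
    # Top level: contextualize the sentence vectors across the paragraph.
    return self_attention(sent_vecs)

rng = np.random.default_rng(0)
# A toy paragraph: 3 sentences with 5, 7, and 4 "word" embeddings of dim 8.
paragraph = [rng.standard_normal((n, 8)) for n in (5, 7, 4)]
encoded = encode_paragraph(paragraph)
print(encoded.shape)  # one contextualized vector per sentence
```

In the paper's full model the pooled sentence states additionally condition on previous hidden states; this sketch only shows the two-level attention structure.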

Citation (APA)

Zhang, Y., Meng, K., & Liu, G. (2019). Paragraph-level hierarchical neural machine translation. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11955 LNCS, pp. 328–339). Springer. https://doi.org/10.1007/978-3-030-36718-3_28
