Sequence-Level Training for Non-Autoregressive Neural Machine Translation

Abstract

In recent years, Neural Machine Translation (NMT) has achieved notable results in various translation tasks. However, the word-by-word generation manner determined by the autoregressive mechanism leads to high translation latency and restricts low-latency applications of NMT. Non-Autoregressive Neural Machine Translation (NAT) removes the autoregressive mechanism and achieves significant decoding speedup by generating target words independently and simultaneously.
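The latency contrast the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' implementation; the `decoder` callable, the token IDs, and the placeholder decoder input are hypothetical stand-ins.

    import torch

    # Hypothetical stub: decoder(prev_tokens, encoder_out) returns
    # logits of shape (batch, length, vocab). Not the paper's code.

    def autoregressive_decode(decoder, encoder_out, bos_id, eos_id, max_len):
        """Word-by-word generation: each step conditions on all previous
        outputs, so the loop is inherently sequential (the latency source)."""
        tokens = [bos_id]
        for _ in range(max_len):
            logits = decoder(torch.tensor([tokens]), encoder_out)
            next_id = int(logits[0, -1].argmax())
            tokens.append(next_id)
            if next_id == eos_id:
                break
        return tokens[1:]

    def non_autoregressive_decode(decoder, encoder_out, target_len):
        """All target positions are predicted independently in one forward
        pass, removing the sequential dependency between output tokens."""
        placeholder = torch.zeros(1, target_len, dtype=torch.long)
        logits = decoder(placeholder, encoder_out)  # (1, target_len, vocab)
        return logits.argmax(dim=-1)[0].tolist()

Note that the non-autoregressive pass needs the target length up front; NAT models typically predict it with a separate component, which the sketch leaves out.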

Citation (APA)

Shao, C., Feng, Y., Zhang, J., Meng, F., & Zhou, J. (2021). Sequence-Level Training for Non-Autoregressive Neural Machine Translation. Computational Linguistics, 47(4), 891–925. https://doi.org/10.1162/COLI_a_00421
