Context-Aware Cross-Attention for Non-Autoregressive Translation


Abstract

Non-autoregressive translation (NAT) significantly accelerates inference by predicting the entire target sequence in parallel. However, because the decoder lacks target-side dependency modelling, the conditional generation process depends heavily on cross-attention. In this paper, we reveal a localness perception problem in NAT cross-attention, which makes it difficult for the model to adequately capture source context. To alleviate this problem, we propose to augment conventional cross-attention with enhanced signals from neighbouring source tokens. Experimental results on several representative datasets show that our approach consistently improves translation quality over strong NAT baselines. Extensive analyses demonstrate that the enhanced cross-attention better exploits source context by leveraging both local and global information.
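The abstract describes blending locality-aware attention over neighbouring source tokens with standard global cross-attention. The following NumPy sketch illustrates one plausible form of that idea: a global attention distribution is mixed with a locally masked variant whose window is centred on a proportionally aligned source position. The function name, the alignment heuristic, and the fixed gate are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def context_aware_cross_attention(q, k, v, window=2, gate=0.5):
    """Illustrative sketch: mix global cross-attention with a
    locality-restricted variant that only attends to source tokens
    near a proportionally aligned position (hypothetical design)."""
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)            # (tgt_len, src_len)
    global_attn = softmax(scores)

    tgt_len, src_len = scores.shape
    local_scores = np.full_like(scores, -1e9)  # mask everything by default
    for i in range(tgt_len):
        # heuristic alignment: map target position i to a source index
        center = int(round(i * (src_len - 1) / max(tgt_len - 1, 1)))
        lo, hi = max(0, center - window), min(src_len, center + window + 1)
        local_scores[i, lo:hi] = scores[i, lo:hi]
    local_attn = softmax(local_scores)

    # gated combination of local and global source context
    attn = gate * local_attn + (1 - gate) * global_attn
    return attn @ v, attn
```

Because both distributions are valid softmaxes, their convex combination remains a proper attention distribution over the source sequence; a learned gate (rather than the fixed constant used here) would let the model trade off local and global context per head.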

Citation (APA)

Ding, L., Wang, L., Wu, D., Tao, D., & Tu, Z. (2020). Context-Aware Cross-Attention for Non-Autoregressive Translation. In COLING 2020 - 28th International Conference on Computational Linguistics, Proceedings of the Conference (pp. 4396–4402). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2020.coling-main.389
