Big bidirectional insertion representations for documents

Abstract

The Insertion Transformer is well suited for long-form text generation due to its parallel generation capabilities, requiring O(log₂ n) generation steps to generate n tokens. However, modeling long sequences is difficult, as there is more ambiguity captured in the attention mechanism. This work proposes the Big Bidirectional Insertion Representations for Documents (Big BIRD), an insertion-based model for document-level translation tasks. We scale up insertion-based models to long-form documents. Our key contribution is introducing sentence alignment via sentence-positional embeddings between the source and target documents. We show an improvement of +4.3 BLEU on the WMT'19 English→German document-level translation task compared with the Insertion Transformer baseline.

Citation (APA)

Li, L., & Chan, W. (2019). Big bidirectional insertion representations for documents. In EMNLP-IJCNLP 2019 - Proceedings of the 3rd Workshop on Neural Generation and Translation (pp. 194–198). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/d19-5620
