Attention Strategies for Multi-Source Sequence-to-Sequence Learning

129 citations · 196 Mendeley readers

Abstract

Modeling attention in neural multi-source sequence-to-sequence learning remains a relatively unexplored area, despite its usefulness in tasks that incorporate multiple source languages or modalities. We propose two novel approaches, flat and hierarchical, to combining the outputs of attention mechanisms over each source sequence. We compare the proposed methods with existing techniques and present the results of a systematic evaluation of those methods on the WMT16 Multimodal Translation and Automatic Post-editing tasks. We show that the proposed methods achieve competitive results on both tasks.
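To make the two combination strategies concrete, here is a minimal NumPy sketch of how they differ. It assumes a Bahdanau-style additive attention, a shared hidden size across encoders, and (in the flat variant) shared projection parameters across sources; the paper uses per-source projections, and the names `attend` and `make_params` as well as the random toy inputs are illustrative rather than the authors' code.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attend(query, states, W_q, W_s, v):
    """Additive (Bahdanau-style) attention: e_i = v^T tanh(W_q q + W_s h_i).
    Returns the attention-weighted context vector over `states`."""
    energies = np.tanh(query @ W_q.T + states @ W_s.T) @ v
    weights = softmax(energies)
    return weights @ states

dim = 8

def make_params():
    # Fresh random attention parameters; in a real model these are learned.
    return (rng.normal(size=(dim, dim)) * 0.1,
            rng.normal(size=(dim, dim)) * 0.1,
            rng.normal(size=dim) * 0.1)

# Two source sequences (e.g., source-language text and image regions),
# both projected to a shared hidden size for simplicity.
src_a = rng.normal(size=(5, dim))   # 5 encoder states
src_b = rng.normal(size=(3, dim))   # 3 encoder states
query = rng.normal(size=dim)        # current decoder state

# Hierarchical combination:
# level 1 attends over each source independently ...
ctx_a = attend(query, src_a, *make_params())
ctx_b = attend(query, src_b, *make_params())
# ... and level 2 attends over the resulting per-source context vectors.
hier_ctx = attend(query, np.stack([ctx_a, ctx_b]), *make_params())

# Flat combination: a single softmax over the union of all encoder
# states (per-source projections are collapsed into one here for brevity).
flat_ctx = attend(query, np.vstack([src_a, src_b]), *make_params())

print(hier_ctx.shape, flat_ctx.shape)  # (8,) (8,)
```

The design difference is where normalization happens: the flat strategy competes all encoder states of all sources against each other in one distribution, while the hierarchical strategy first normalizes within each source and then learns a second distribution over whole sources.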

Citation (APA)

Libovický, J., & Helcl, J. (2017). Attention strategies for multi-Source sequence-to-Sequence learning. In ACL 2017 - 55th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference (Long Papers) (Vol. 2, pp. 196–202). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/P17-2031
