A Study on Self-attention Mechanism for AMR-to-text Generation

Abstract

Introduced by Vaswani et al., the Transformer architecture, with its effective use of the self-attention mechanism, has shown outstanding performance in translating sequences of text from one language to another. In this paper, we conduct experiments using self-attention to convert an abstract meaning representation (AMR) graph, a semantic representation, into a natural language sentence, a task known as AMR-to-text generation. On the benchmark dataset for this task, we obtain promising results compared to existing deep learning methods in the literature.
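To make the mechanism concrete, the sketch below shows single-head scaled dot-product self-attention as defined by Vaswani et al. (2017), the building block the paper applies to linearized AMR input. It is an illustrative NumPy implementation with random projection matrices, not the authors' code or exact model configuration.

```python
# Minimal sketch of scaled dot-product self-attention (Vaswani et al., 2017).
# Illustrative only: dimensions, weights, and the toy input are assumptions.
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Single-head self-attention over a sequence of token embeddings X.

    X          : (seq_len, d_model) input embeddings (e.g., a linearized AMR graph)
    Wq, Wk, Wv : (d_model, d_k) projection matrices (learned in practice, random here)
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv                  # project into queries, keys, values
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                   # scaled dot-product compatibility
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # softmax over key positions
    return weights @ V                                # each position attends to all others

# Toy usage with random embeddings standing in for AMR tokens.
rng = np.random.default_rng(0)
seq_len, d_model, d_k = 5, 16, 8
X = rng.normal(size=(seq_len, d_model))
Wq, Wk, Wv = (rng.normal(size=(d_model, d_k)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)            # (5, 8)
```

In the full Transformer, several such heads run in parallel and are combined with feed-forward layers, residual connections, and layer normalization; the encoder-decoder variant is what the AMR-to-text experiments build on.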

Citation (APA)

Sinh, V. T., & Minh, N. L. (2019). A Study on Self-attention Mechanism for AMR-to-text Generation. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11608 LNCS, pp. 321–328). Springer Verlag. https://doi.org/10.1007/978-3-030-23281-8_27
