Introduced by Vaswani et al., the Transformer architecture, through its effective use of the self-attention mechanism, has shown outstanding performance in translating sequences of text from one language to another. In this paper, we conduct experiments applying self-attention to the conversion of an abstract meaning representation (AMR) graph, a semantic representation, into a natural language sentence, the task known as AMR-to-text generation. On the benchmark dataset for this task, we obtain promising results compared with existing deep learning methods in the literature.
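For context, the core operation the abstract refers to is scaled dot-product self-attention from Vaswani et al. (2017), Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. Below is a minimal single-head NumPy sketch of that formula; the function name, weight matrices, and toy dimensions are illustrative and not taken from the paper.

```python
import numpy as np

def scaled_dot_product_self_attention(x, w_q, w_k, w_v):
    """Single-head self-attention over a sequence x of shape (n, d_model).

    w_q, w_k, w_v project x to queries, keys, and values; attention
    weights are softmax(Q K^T / sqrt(d_k)), per Vaswani et al. (2017).
    """
    q, k, v = x @ w_q, x @ w_k, x @ w_v       # (n, d_k), (n, d_k), (n, d_v)
    scores = q @ k.T / np.sqrt(k.shape[-1])   # (n, n) pairwise scores
    # Row-wise softmax, shifted by the row max for numerical stability.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v                        # (n, d_v) attended values

# Toy usage: 5 tokens, model dim 8, head dim 4 (illustrative sizes only).
rng = np.random.default_rng(0)
x = rng.normal(size=(5, 8))
w_q, w_k, w_v = (rng.normal(size=(8, 4)) for _ in range(3))
print(scaled_dot_product_self_attention(x, w_q, w_k, w_v).shape)  # (5, 4)
```

In AMR-to-text generation, a linearized AMR graph would take the place of the source-language token sequence x; the attention computation itself is unchanged.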
Citation:
Sinh, V. T., & Minh, N. L. (2019). A Study on Self-attention Mechanism for AMR-to-text Generation. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11608 LNCS, pp. 321–328). Springer Verlag. https://doi.org/10.1007/978-3-030-23281-8_27