Modeling Compositionality with Dependency Graph for Dialogue Generation

Abstract

Because of the compositionality of natural language, syntactic structure, which encodes the relationships between words, is a key factor in semantic understanding. However, the widely adopted Transformer struggles to learn syntactic structure effectively in dialogue generation tasks. To explicitly model the compositionality of language in the Transformer block, we restrict the information flow between words by constructing a directed dependency graph, and we propose Dependency Relation Attention (DRA). Experimental results demonstrate that DRA further improves the performance of state-of-the-art models for dialogue generation.
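As a rough illustration of the idea named in the abstract, the sketch below restricts scaled dot-product attention to pairs of tokens connected in a dependency graph. This is a minimal sketch in PyTorch, assuming a precomputed boolean adjacency mask; the function name `dependency_masked_attention`, the mask layout, and the toy graph are all hypothetical, and the paper's actual DRA formulation (for example, how directed relations are scored or embedded) may differ.

```python
# Hypothetical sketch of dependency-masked attention; not the paper's
# actual DRA implementation. It only illustrates restricting information
# flow to edges of a (directed) dependency graph.
import torch
import torch.nn.functional as F

def dependency_masked_attention(q, k, v, dep_mask):
    """Scaled dot-product attention limited to dependency-graph edges.

    q, k, v:   (batch, seq_len, d_model) query/key/value tensors.
    dep_mask:  (batch, seq_len, seq_len) boolean tensor; dep_mask[b, i, j]
               is True when token i may attend to token j, i.e. the pair
               is connected in the dependency graph.
    """
    d_model = q.size(-1)
    scores = q @ k.transpose(-2, -1) / d_model ** 0.5  # (batch, L, L)
    # Block attention between token pairs with no dependency edge.
    scores = scores.masked_fill(~dep_mask, float("-inf"))
    weights = F.softmax(scores, dim=-1)
    return weights @ v

# Toy usage: one sentence of 4 tokens with a hand-built dependency mask.
batch, L, d = 1, 4, 8
q = k = v = torch.randn(batch, L, d)
# Each token attends to itself; a few extra edges stand in for head links.
mask = torch.eye(L, dtype=torch.bool).unsqueeze(0)
mask[0, 1, 0] = mask[0, 2, 1] = mask[0, 3, 1] = True
out = dependency_masked_attention(q, k, v, mask)
print(out.shape)  # torch.Size([1, 4, 8])
```

In practice the mask would be derived from a dependency parser's output rather than hand-built, and a relation-aware variant would additionally condition the attention scores on the dependency relation types.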

Cite

APA

Chen, X., Chen, Y., Xing, X., Xu, X., Han, W., & Tie, Q. (2022). Modeling Compositionality with Dependency Graph for Dialogue Generation. In SUKI 2022 - Workshop on Structured and Unstructured Knowledge Integration, Proceedings of the Workshop (pp. 9–16). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2022.suki-1.2
