Recursive top-down production for sentence generation with latent trees

1 citation · 66 Mendeley readers

Abstract

We model the recursive production property of context-free grammars for natural and synthetic languages. To this end, we present a dynamic programming algorithm that marginalises over latent binary tree structures with N leaves, allowing us to compute the likelihood of a sequence of N tokens under a latent tree model, which we maximise to train a recursive neural function. We demonstrate performance on two synthetic tasks: SCAN (Lake and Baroni, 2017), where it outperforms previous models on the LENGTH split, and English question formation (McCoy et al., 2020), where it performs comparably to decoders with the ground-truth tree structure. We also present experimental results on German-English translation on the Multi30k dataset (Elliott et al., 2016), and qualitatively analyse the induced tree structures our model learns for the SCAN tasks and the German-English translation task.
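The core computational idea above is a dynamic programming algorithm that marginalises over all latent binary tree structures with N leaves. A minimal sketch of that kind of inside-style recursion is shown below, assuming a hypothetical `span_score` function in place of the paper's learned recursive neural scorer; with all scores set to 1.0 the recursion simply counts binary trees, returning the (N−1)-th Catalan number.

```python
def span_score(i, j):
    # Hypothetical stand-in for a learned score of forming span [i, j).
    # In the paper this would come from a recursive neural function.
    return 1.0

def marginalise_trees(n):
    """Sum, over all binary trees with n leaves, of the product of span scores.

    inside[i][j] holds the marginal for the span of tokens [i, j).
    Runs in O(n^3) time, like CKY.
    """
    inside = [[0.0] * (n + 1) for _ in range(n)]
    for i in range(n):
        inside[i][i + 1] = 1.0  # base case: a single leaf
    for width in range(2, n + 1):
        for i in range(n - width + 1):
            j = i + width
            for k in range(i + 1, j):  # every binary split point
                inside[i][j] += inside[i][k] * inside[k][j] * span_score(i, j)
    return inside[0][n]
```

With uniform scores, `marginalise_trees(4)` evaluates to 5.0, the number of binary trees over four leaves. In the actual model, `span_score` would be a learned (log-)probability and the marginal would be maximised during training; this sketch only illustrates the shape of the marginalisation.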

Citation (APA)

Tan, S., Shen, Y., O’Donnell, T. J., Sordoni, A., & Courville, A. (2020). Recursive top-down production for sentence generation with latent trees. In Findings of the Association for Computational Linguistics: EMNLP 2020 (pp. 2291–2307). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2020.findings-emnlp.208
