StructFormer: Joint unsupervised induction of dependency and constituency structure from masked language modeling

Abstract

There are two major classes of natural language grammars: dependency grammar, which models one-to-one correspondences between words, and constituency grammar, which models how one or more corresponding words are assembled into phrases. While previous unsupervised parsing methods mostly focus on inducing only one class of grammar, we introduce a novel model, StructFormer, that can simultaneously induce dependency and constituency structure. To achieve this, we propose a new parsing framework that can jointly generate a constituency tree and a dependency graph. We then integrate the induced dependency relations into the transformer, in a differentiable manner, through a novel dependency-constrained self-attention mechanism. Experimental results show that our model achieves strong results on unsupervised constituency parsing, unsupervised dependency parsing, and masked language modeling at the same time.
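
To make the core idea concrete, below is a minimal sketch (not the authors' implementation) of what a dependency-constrained self-attention step could look like: standard scaled dot-product attention whose weights are modulated by a soft dependency distribution and renormalized. The matrix p_dep and the function names are hypothetical placeholders; in StructFormer the dependency distribution is itself produced differentiably by the parsing module rather than supplied externally.

import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def dependency_constrained_attention(q, k, v, p_dep):
    """q, k, v: (seq_len, d) arrays for one attention head.
    p_dep: (seq_len, seq_len) soft dependency distribution, where
    p_dep[i, j] is the (assumed) probability that word j is a
    dependency-relevant context for word i."""
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)      # standard attention logits
    attn = softmax(scores, axis=-1)    # content-based attention weights
    attn = attn * p_dep                # soft constraint from induced dependencies
    attn = attn / (attn.sum(axis=-1, keepdims=True) + 1e-9)  # renormalize rows
    return attn @ v

# Toy usage: 4 tokens, 8-dim head, uniform "dependency" prior.
rng = np.random.default_rng(0)
q, k, v = (rng.standard_normal((4, 8)) for _ in range(3))
p_dep = np.full((4, 4), 0.25)
out = dependency_constrained_attention(q, k, v, p_dep)
print(out.shape)  # (4, 8)

Because the constraint is a soft, differentiable re-weighting rather than a hard mask, gradients from the masked language modeling loss can flow back into whatever module produces p_dep, which is what makes joint grammar induction possible.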

Citation (APA)

Shen, Y., Tay, Y., Zheng, C., Bahri, D., Metzler, D., & Courville, A. (2021). StructFormer: Joint unsupervised induction of dependency and constituency structure from masked language modeling. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (ACL-IJCNLP 2021) (pp. 7196–7209). Association for Computational Linguistics. https://doi.org/10.18653/v1/2021.acl-long.559
