Universal dependencies according to BERT: Both more specific and more general

Abstract

This work focuses on analyzing the form and extent of syntactic abstraction captured by BERT by extracting labeled dependency trees from self-attentions. Previous work showed that individual BERT heads tend to encode particular dependency relation types. We extend these findings by explicitly comparing BERT relations to Universal Dependencies (UD) annotations, showing that they often do not match one-to-one. We suggest a method for relation identification and syntactic tree construction. Our approach produces significantly more consistent dependency trees than previous work, showing that it better explains the syntactic abstractions in BERT. At the same time, it can be successfully applied with only a minimal amount of supervision and generalizes well across languages.
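The core idea above — reading a dependency tree off BERT's self-attention weights — can be illustrated with a minimal sketch. This is a hypothetical simplification, not the authors' exact method: it assumes a single attention matrix (row-stochastic, `attn[i][j]` = attention from token i to token j) and greedily picks each token's most-attended-to other token as its parent; approaches like the one described above typically add constraints (e.g., a maximum-spanning-tree step) to guarantee a well-formed tree.

```python
# Hypothetical sketch: turn one self-attention head's weights into
# unlabeled dependency arcs by a per-token argmax over attention.

def attention_to_parents(attn, root=0):
    """Return parent indices: parent[i] = argmax over j != i of attn[i][j].

    The designated root token points to itself. This greedy choice does
    not guarantee a tree (cycles are possible); it only illustrates how
    attention weights can be read as candidate dependency arcs.
    """
    parents = []
    for i, row in enumerate(attn):
        if i == root:
            parents.append(root)  # root is its own parent by convention
            continue
        best = max((j for j in range(len(row)) if j != i),
                   key=lambda j: row[j])
        parents.append(best)
    return parents

# Toy example: 3 tokens, token 0 designated as root.
attn = [
    [0.8, 0.1, 0.1],
    [0.7, 0.2, 0.1],  # token 1 attends mostly to token 0
    [0.2, 0.7, 0.1],  # token 2 attends mostly to token 1
]
print(attention_to_parents(attn))  # [0, 0, 1]
```

In this toy matrix the arcs 0→1 and 1→2 are recovered, mirroring how individual heads were found to encode particular relation types.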

Citation (APA)

Limisiewicz, T., Mareček, D., & Rosa, R. (2020). Universal dependencies according to BERT: Both more specific and more general. In Findings of the Association for Computational Linguistics: EMNLP 2020 (pp. 2710–2722). Association for Computational Linguistics. https://doi.org/10.18653/v1/2020.findings-emnlp.245
