JBNU at MRP 2019: Multi-level biaffine attention for semantic dependency parsing

Abstract

This paper describes Jeonbuk National University (JBNU)'s system for the 2019 shared task on Cross-Framework Meaning Representation Parsing (MRP 2019) at the Conference on Computational Natural Language Learning. Of the five frameworks, we address only the DELPH-IN MRS Bi-Lexical Dependencies (DM), Prague Semantic Dependencies (PSD), and Universal Conceptual Cognitive Annotation (UCCA) frameworks. We propose a unified parsing model using biaffine attention (Dozat and Manning, 2017), consisting of 1) a BERT-BiLSTM encoder and 2) a biaffine attention decoder. First, the BERT-BiLSTM sentence encoder uses BERT to compose a sentence's wordpieces into word-level embeddings and then applies a BiLSTM to the resulting word-level representations. Second, the biaffine attention decoder computes scores for an edge's existence and its labels using biaffine attention functions over role-dependent representations. We also present multi-level biaffine attention models that combine the role-dependent representations produced at multiple intermediate layers.
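The edge-scoring step described above can be illustrated with a minimal sketch of biaffine attention in the style of Dozat and Manning (2017). All names, dimensions, and the random parameters below are illustrative assumptions, not values from the paper: `h_head` and `h_dep` stand in for the role-dependent representations, and the score matrix combines a bilinear term with linear terms for each role.

```python
import numpy as np

# Hedged sketch of biaffine edge scoring; shapes and names are assumptions,
# not the authors' implementation.
rng = np.random.default_rng(0)
n, d = 5, 8  # n words in the sentence, role-dependent representation size d

# Role-dependent representations: each word as a potential head or dependent.
h_head = rng.standard_normal((n, d))
h_dep = rng.standard_normal((n, d))

# Biaffine parameters: a bilinear matrix U plus a linear weight per role.
U = rng.standard_normal((d, d))
w_head = rng.standard_normal(d)
w_dep = rng.standard_normal(d)
b = 0.0

# scores[i, j] scores word i as the head of word j:
#   h_head[i]^T U h_dep[j] + w_head . h_head[i] + w_dep . h_dep[j] + b
scores = (h_head @ U @ h_dep.T
          + (h_head @ w_head)[:, None]
          + (h_dep @ w_dep)[None, :]
          + b)
assert scores.shape == (n, n)
```

Label scoring works the same way, except that U becomes a tensor with one slice per label so that each cell yields a score vector over the label set.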

Citation (APA)

Na, S. H., Min, J., Park, K., Shin, J. H., & Kim, Y. K. (2020). JBNU at MRP 2019: Multi-level biaffine attention for semantic dependency parsing. In CoNLL 2019 - SIGNLL Conference on Computational Natural Language Learning, Proceedings of the Shared Task on Cross-Framework Meaning Representation Parsing at the 2019 Conference on Natural Language Learning (pp. 95–103). Association for Computational Linguistics. https://doi.org/10.18653/v1/K19-2009
