HIT-SCIR at MRP 2019: A unified pipeline for meaning representation parsing via efficient training and effective encoding


Abstract

This paper describes our system (HIT-SCIR) for the CoNLL 2019 shared task on Cross-Framework Meaning Representation Parsing. We extended a basic transition-based parser with two improvements: (a) efficient training, by parallelizing stack LSTM training; and (b) effective encoding, by adopting the deep contextualized word embeddings of BERT (Devlin et al., 2019). Overall, we propose a unified pipeline for meaning representation parsing, comprising framework-specific transition-based parsers, BERT-enhanced word representations, and post-processing. In the final evaluation, our system ranked first by ALL-F1 (86.2%) and, in particular, ranked first on the UCCA framework (81.67%).
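The pipeline's framework-specific parsers are transition-based: a parser maintains a stack and a buffer and applies a sequence of actions (the paper encodes these states with stack LSTMs). As a rough illustration of those stack/buffer mechanics only, here is a minimal arc-standard-style sketch with a hand-written oracle for a toy sentence. The transition names, the `oracle` callback, and the toy example are illustrative assumptions, not the paper's actual transition systems.

```python
# Illustrative sketch (NOT the paper's transition sets): a minimal
# arc-standard-style transition system showing the stack/buffer state
# that a transition-based parser, e.g. one encoded by stack LSTMs,
# operates over.

def parse(tokens, oracle):
    """Apply transitions chosen by oracle(stack, buffer) until done.

    Returns the set of (head, dependent) arcs produced.
    """
    stack, buffer, arcs = [], list(range(len(tokens))), set()
    while buffer or len(stack) > 1:
        action = oracle(stack, buffer)
        if action == "SHIFT":
            # Move the next buffer token onto the stack.
            stack.append(buffer.pop(0))
        elif action == "LEFT-ARC":
            # Second-from-top becomes a dependent of the top.
            dep = stack.pop(-2)
            arcs.add((stack[-1], dep))
        elif action == "RIGHT-ARC":
            # Top becomes a dependent of second-from-top.
            dep = stack.pop()
            arcs.add((stack[-1], dep))
    return arcs

# Hypothetical hand-written oracle for "I saw her":
# attach tokens 0 and 2 to head 1 ("saw").
def toy_oracle(stack, buffer):
    if len(stack) >= 2 and stack[-1] == 1:
        return "LEFT-ARC"
    if len(stack) >= 2 and stack[-1] == 2:
        return "RIGHT-ARC"
    return "SHIFT"

print(parse(["I", "saw", "her"], toy_oracle))
```

In a trained parser the oracle is replaced by a classifier over the encoded stack/buffer state; in this system, that encoding is built from BERT-enhanced word representations.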

Citation (APA)

Che, W., Dou, L., Xu, Y., Wang, Y., Liu, Y., & Liu, T. (2020). HIT-SCIR at MRP 2019: A unified pipeline for meaning representation parsing via efficient training and effective encoding. In CoNLL 2019 - SIGNLL Conference on Computational Natural Language Learning, Proceedings of the Shared Task on Cross-Framework Meaning Representation Parsing at the 2019 Conference on Natural Language Learning (pp. 76–85). Association for Computational Linguistics. https://doi.org/10.18653/v1/K19-2007
