Fast end-to-end coreference resolution for Korean

Abstract

Recently, end-to-end neural approaches have shown significant improvements over traditional pipeline-based models in English coreference resolution. However, these advancements have come at the cost of computational complexity, and recent work has not focused on tackling this problem. Hence, in this paper, to cope with this issue, we propose BERT-SRU-based Pointer Networks that leverage a linguistic property of head-final languages. Applying this model to Korean coreference resolution, we significantly reduce the coreference-linking search space. Combining this with Ensemble Knowledge Distillation, we maintain state-of-the-art performance of 66.9% CoNLL F1 on the ETRI test set while achieving a 2x speedup (30 docs/sec) in document processing time.
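The abstract's claim about reducing the linking search space follows from head-finality: in a head-final language such as Korean, a mention's syntactic head is its final token, so candidates can be indexed by head position alone instead of by every (start, end) span pair. The sketch below is illustrative only (the function names, token count, and span-width limit are assumptions, not the paper's implementation) and simply counts the two candidate sets to show the asymptotic difference.

```python
def span_candidates(n_tokens, max_width):
    """All (start, end) spans up to max_width tokens wide --
    the usual end-to-end candidate space, O(n * max_width)."""
    return [(s, e) for s in range(n_tokens)
                   for e in range(s, min(s + max_width, n_tokens))]

def head_candidates(n_tokens):
    """One candidate per head (i.e., final) token position --
    the head-final shortcut, O(n)."""
    return list(range(n_tokens))

# Illustrative numbers: a 100-token document with spans up to width 10.
n, w = 100, 10
print(len(span_candidates(n, w)))  # 955 candidate spans
print(len(head_candidates(n)))     # 100 candidate head positions
```

Scoring pairs of head positions instead of pairs of arbitrary spans is what makes the pointer-network formulation cheap enough for the reported 2x speedup.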

Citation (APA)

Park, C., Shin, J., Park, S., Lim, J., & Lee, C. (2020). Fast end-to-end coreference resolution for Korean. In Findings of the Association for Computational Linguistics: EMNLP 2020 (pp. 2610–2624). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2020.findings-emnlp.237
