Coreference Resolution without Span Representations


Abstract

The introduction of pretrained language models has reduced many complex task-specific NLP models to simple lightweight layers. An exception to this trend is coreference resolution, where a sophisticated task-specific model is appended to a pretrained transformer encoder. While highly effective, the model has a very large memory footprint, primarily due to dynamically constructed span and span-pair representations, which hinders the processing of complete documents and the ability to train on multiple instances in a single batch. We introduce a lightweight end-to-end coreference model that removes the dependency on span representations, handcrafted features, and heuristics. Our model performs competitively with the current standard model, while being simpler and more efficient.
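
The efficiency gain described in the abstract comes from scoring candidate spans using only the contextualized representations of their start and end tokens, rather than materializing an embedding for every span and span pair. Below is a minimal PyTorch sketch of that pattern; the module name, projection size, and the bilinear scoring form are illustrative assumptions based on the abstract, not the authors' released implementation.

```python
import torch
import torch.nn as nn


class StartEndMentionScorer(nn.Module):
    """Scores every (start, end) candidate span using only per-token
    projections, so no explicit span or span-pair embeddings are ever
    materialized. Names and sizes are illustrative assumptions."""

    def __init__(self, hidden_size: int, proj_size: int = 64):
        super().__init__()
        self.start_proj = nn.Linear(hidden_size, proj_size)
        self.end_proj = nn.Linear(hidden_size, proj_size)
        # Bilinear form B: score(i, j) = start_i @ B @ end_j
        self.bilinear = nn.Parameter(torch.empty(proj_size, proj_size))
        nn.init.xavier_uniform_(self.bilinear)

    def forward(self, token_reprs: torch.Tensor) -> torch.Tensor:
        # token_reprs: (seq_len, hidden_size) output of a pretrained encoder
        starts = self.start_proj(token_reprs)   # (seq_len, proj_size)
        ends = self.end_proj(token_reprs)       # (seq_len, proj_size)
        # One matmul chain scores all start/end pairs at once, keeping
        # memory at O(seq_len^2) scalars rather than O(seq_len^2)
        # full span vectors.
        return starts @ self.bilinear @ ends.T  # (seq_len, seq_len)


# Stand-in usage with random "encoder" output (512 tokens, 768 dims).
scores = StartEndMentionScorer(hidden_size=768)(torch.randn(512, 768))
print(scores.shape)  # torch.Size([512, 512])
```

Because the pairwise scores are computed directly from token-level projections, the whole head is a few linear layers on top of the encoder, which is what allows full documents to fit in memory and multiple instances per batch.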

Cite

APA

Kirstain, Y., Ram, O., & Levy, O. (2021). Coreference Resolution without Span Representations. In ACL-IJCNLP 2021 - 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, Proceedings of the Conference (Vol. 2, pp. 14–19). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.acl-short.3
