Abstract
Event coreference resolution is an important research problem with many applications. Despite the recent remarkable success of pretrained language models, we argue that it is still highly beneficial to utilize symbolic features for the task. However, as the input for coreference resolution typically comes from upstream components in the information extraction pipeline, the automatically extracted symbolic features can be noisy and contain errors. Also, depending on the specific context, some features can be more informative than others. Motivated by these observations, we propose a novel context-dependent gated module to adaptively control the information flow from the input symbolic features. Combined with a simple noisy training method, our best models achieve state-of-the-art results on two datasets: ACE 2005 and KBP 2016.
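The sketch below illustrates the general idea of a context-dependent gate, not the authors' exact architecture: a gate conditioned on the contextual mention representation rescales the embedded symbolic features before they are fused for pair scoring, so noisy features can be attenuated. All module names, dimensions, and the fusion step are illustrative assumptions.

```python
import torch
import torch.nn as nn


class ContextDependentGate(nn.Module):
    """Illustrative sketch: a sigmoid gate conditioned on the contextual
    representation controls how much of each symbolic feature dimension
    (e.g., embedded event type or realis) passes through."""

    def __init__(self, context_dim: int, symbolic_dim: int):
        super().__init__()
        # One gate value in (0, 1) per symbolic feature dimension.
        self.gate = nn.Sequential(
            nn.Linear(context_dim + symbolic_dim, symbolic_dim),
            nn.Sigmoid(),
        )

    def forward(self, context_repr: torch.Tensor, symbolic_repr: torch.Tensor) -> torch.Tensor:
        # context_repr: (batch, context_dim) mention embedding from a pretrained encoder
        # symbolic_repr: (batch, symbolic_dim) embedded symbolic features (possibly noisy)
        g = self.gate(torch.cat([context_repr, symbolic_repr], dim=-1))
        # Features the context deems unreliable are scaled down.
        return g * symbolic_repr


# Hypothetical usage: concatenate the gated symbolic features with the
# contextual embedding as input to a downstream coreference scorer.
gate = ContextDependentGate(context_dim=768, symbolic_dim=64)
ctx = torch.randn(8, 768)   # e.g., mention embeddings from a pretrained LM
sym = torch.randn(8, 64)    # embedded symbolic features from upstream IE components
fused = torch.cat([ctx, gate(ctx, sym)], dim=-1)
```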
Citation
Lai, T., Ji, H., Bui, T., Tran, Q. H., Dernoncourt, F., & Chang, W. (2021). A Context-Dependent Gated Module for Incorporating Symbolic Semantics into Event Coreference Resolution. In NAACL-HLT 2021 - 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Proceedings of the Conference (pp. 3491–3499). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.naacl-main.274