Enhanced word representations for bridging anaphora resolution

Abstract

Most current models of word representations (e.g., GloVe) successfully capture fine-grained semantics. However, the semantic similarity exhibited in these word embeddings is not suitable for resolving bridging anaphora, which requires knowledge of associative similarity (i.e., relatedness) rather than the semantic similarity that holds between synonyms or hypernyms. We create word embeddings (embeddings_PP) to capture such relatedness by exploiting the syntactic structure of noun phrases. We demonstrate that using embeddings_PP alone achieves around 30% accuracy for bridging anaphora resolution on the ISNotes corpus. Furthermore, we achieve a substantial gain over the state-of-the-art system (Hou et al., 2013b) for bridging antecedent selection.
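To make the idea concrete, below is a minimal, illustrative sketch (not the paper's actual pipeline) of how relatedness-oriented embeddings might be built from noun-phrase structure: head/modifier noun pairs from prepositional constructions such as "the door of the house" are treated as tiny training contexts, so that associatively related nouns end up close in vector space and can then score bridging antecedent candidates. The extraction rule, the toy corpus, and the gensim setup are all assumptions for illustration only.

```python
# Illustrative sketch: train relatedness-oriented noun embeddings from
# "X of Y" noun-phrase structure, then rank antecedent candidates by
# vector similarity. Hypothetical extraction rule and toy data.
import re
from gensim.models import Word2Vec

# Toy stand-in for a large parsed corpus; in practice such pairs would come
# from syntactic parses of noun phrases (prepositional/possessive structures).
noun_phrases = [
    "the door of the house",
    "the windows of the building",
    "the residents of the city",
    "the roof of the house",
    "the mayor of the city",
]

def head_modifier_pair(np_text):
    """Very rough extraction: take the nouns before and after 'of'."""
    m = re.search(r"(\w+) of (?:the )?(\w+)", np_text)
    return [m.group(1), m.group(2)] if m else None

pairs = [p for p in (head_modifier_pair(np) for np in noun_phrases) if p]

# Each head-modifier pair acts as a two-word "sentence"; skip-gram then
# pushes associatively related nouns (door/house, mayor/city) together.
model = Word2Vec(sentences=pairs, vector_size=50, window=2,
                 min_count=1, sg=1, epochs=50)

# Antecedent selection by relatedness: pick the candidate noun whose
# vector is closest to the anaphor's head noun.
anaphor, candidates = "door", ["house", "city"]
best = max(candidates, key=lambda c: model.wv.similarity(anaphor, c))
print(f"Most related antecedent candidate for '{anaphor}': {best}")
```

In this sketch, cosine similarity under the pair-trained embeddings stands in for the relatedness signal that embeddings_PP are designed to capture; the paper's actual extraction and training choices may differ.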

Cite (APA)

Hou, Y. (2018). Enhanced word representations for bridging anaphora resolution. In NAACL HLT 2018 - 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies - Proceedings of the Conference (Vol. 2, pp. 1–7). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/n18-2001
