Ontology-aware token embeddings for prepositional phrase attachment

19 citations · 120 Mendeley readers

Abstract

Type-level word embeddings use the same set of parameters to represent all instances of a word regardless of its context, ignoring the inherent lexical ambiguity in language. Instead, we embed semantic concepts (or synsets) as defined in WordNet and represent a word token in a particular context by estimating a distribution over relevant semantic concepts. We use the new, context-sensitive embeddings in a model for predicting prepositional phrase (PP) attachments and jointly learn the concept embeddings and model parameters. We show that using context-sensitive embeddings improves the accuracy of the PP attachment model by 5.4% absolute points, which amounts to a 34.4% relative reduction in errors.
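The core idea is that a token's vector is not looked up directly; it is computed as a context-weighted mixture of embeddings for the WordNet synsets the word can evoke. Below is a minimal sketch of that idea, not the authors' exact architecture: the function names (`synset_embedding`, `token_embedding`), the random initialization, and the simple dot-product softmax over senses are illustrative assumptions, whereas in the paper the concept embeddings and the attention over senses are learned jointly with the PP attachment model.

```python
# Sketch: a word token embedded as a context-weighted mixture of WordNet
# synset embeddings. Requires numpy and nltk with the 'wordnet' corpus.
import numpy as np
from nltk.corpus import wordnet as wn

EMBED_DIM = 50
rng = np.random.default_rng(0)

# Hypothetical synset embedding table; in the paper these vectors are
# learned jointly with the downstream PP attachment model.
synset_vectors = {}

def synset_embedding(name):
    """Look up (or lazily initialize) an embedding for a synset name."""
    if name not in synset_vectors:
        synset_vectors[name] = rng.normal(scale=0.1, size=EMBED_DIM)
    return synset_vectors[name]

def token_embedding(word, context_vector):
    """Embed a token as a softmax-weighted sum of its synset embeddings.

    Weights come from the dot product between each synset embedding and a
    context vector (e.g., an encoding of the surrounding words), so the same
    word type receives different vectors in different contexts.
    """
    synsets = wn.synsets(word)
    if not synsets:  # out-of-ontology word: fall back to a single type vector
        return synset_embedding(word)
    vecs = np.stack([synset_embedding(s.name()) for s in synsets])
    scores = vecs @ context_vector          # relevance of each sense to this context
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                # softmax over candidate senses
    return weights @ vecs                   # expected concept embedding

# Toy usage: the same word type yields different vectors in different contexts.
ctx_a = rng.normal(size=EMBED_DIM)
ctx_b = rng.normal(size=EMBED_DIM)
print(np.allclose(token_embedding("bank", ctx_a),
                  token_embedding("bank", ctx_b)))  # False: context-sensitive
```

In the full model, these token embeddings feed a composition function that scores candidate attachment sites for the prepositional phrase, and the error signal from attachment prediction is backpropagated into the concept embeddings themselves.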

Citation (APA)

Dasigi, P., Ammar, W., Dyer, C., & Hovy, E. (2017). Ontology-aware token embeddings for prepositional phrase attachment. In ACL 2017 - 55th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference (Long Papers) (Vol. 1, pp. 2089–2098). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/P17-1191
