We address the task of computing vector space representations for the meaning of word occurrences, which can vary widely according to context. This task is a crucial step towards a robust, vector-based compositional account of sentence meaning. We argue that existing models for this task do not take syntactic structure sufficiently into account. We present a novel structured vector space model that addresses these issues by incorporating the selectional preferences for words' argument positions. This makes it possible to integrate syntax into the computation of word meaning in context. In addition, the model performs at and above the state of the art for modeling the contextual adequacy of paraphrases.
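To make the core idea concrete, the following is a minimal, hypothetical sketch of how selectional preferences for argument positions might modulate a word's vector in context. It is not the authors' published formulation: the toy vectors, the counts, the helper function, and the choice of componentwise multiplication as the combination operation are all illustrative assumptions.

```python
import numpy as np

# Toy co-occurrence vectors (hypothetical values and dimensions).
# In practice these would come from a large corpus-based vector space.
vectors = {
    "catch":    np.array([0.9, 0.1, 0.4, 0.0]),
    "have":     np.array([0.3, 0.7, 0.2, 0.5]),
    "throw":    np.array([0.8, 0.0, 0.6, 0.1]),
    "contract": np.array([0.1, 0.8, 0.1, 0.6]),
}

def inverse_preference(verb_counts):
    """Frequency-weighted centroid of the verbs observed with a given
    noun in a given argument slot (here: direct object)."""
    total = sum(verb_counts.values())
    dim = len(next(iter(vectors.values())))
    pref = np.zeros(dim)
    for verb, count in verb_counts.items():
        pref += (count / total) * vectors[verb]
    return pref

# Hypothetical counts: verbs observed with "cold" as direct object,
# and verbs observed with "ball" as direct object.
cold_as_object = inverse_preference({"catch": 30, "have": 25, "contract": 10})
ball_as_object = inverse_preference({"throw": 40, "catch": 35})

# Meaning of "catch" in context: combine its vector with the preference
# vector contributed by its object slot filler, here by componentwise
# multiplication (one of several conceivable combination operations).
catch_in_catch_cold = vectors["catch"] * cold_as_object
catch_in_catch_ball = vectors["catch"] * ball_as_object
```

Under these assumptions, the two contextualized vectors for "catch" differ because the object nouns impose different selectional-preference profiles, which is the kind of syntax-sensitive disambiguation the abstract describes.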
CITATION STYLE
Erk, K., & Padó, S. (2008). A structured vector space model for word meaning in context. In Proceedings of the 2008 Conference on Empirical Methods in Natural Language Processing (EMNLP 2008) (pp. 897–906). Association for Computational Linguistics. https://doi.org/10.3115/1613715.1613831