Distributional semantic models (DSMs) have been effective at representing semantics at the word level, and research has recently moved on to building distributional representations for larger segments of text. In this paper, we introduce novel ways of applying context selection and normalisation to vary model sparsity and the range of values of the DSM vectors. We show how these methods enhance the quality of the vectors and thus result in improved low-dimensional and composed representations. We demonstrate these effects on standard word and phrase datasets, and on a new definition retrieval task and dataset. © 2014 Association for Computational Linguistics.
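The two operations the abstract names can be sketched in a few lines. This is a minimal illustration, not the paper's actual method: the toy co-occurrence matrix, the choice of frequency-based column selection, and the use of L2 normalisation are all assumptions for the example; the paper explores its own selection and normalisation schemes.

```python
import numpy as np

# Hypothetical toy co-occurrence matrix: rows are target words,
# columns are context words (counts are made up for illustration).
counts = np.array([
    [4.0, 0.0, 2.0, 1.0],
    [0.0, 3.0, 0.0, 5.0],
    [1.0, 1.0, 1.0, 1.0],
])

def select_contexts(m, k):
    """Keep only the k highest-total context columns, zeroing the rest --
    one simple form of context selection, which increases sparsity."""
    totals = m.sum(axis=0)
    keep = np.argsort(totals)[::-1][:k]
    mask = np.zeros(m.shape[1], dtype=bool)
    mask[keep] = True
    out = m.copy()
    out[:, ~mask] = 0.0
    return out

def l2_normalise(m):
    """Scale each row vector to unit length, constraining the
    range of values in the vectors."""
    norms = np.linalg.norm(m, axis=1, keepdims=True)
    norms[norms == 0] = 1.0  # avoid dividing a zero row by zero
    return m / norms

# Select the 2 strongest context dimensions, then normalise each row.
vectors = l2_normalise(select_contexts(counts, k=2))
```

Varying `k` trades off sparsity against coverage, and normalisation keeps vectors comparable in magnitude before dimensionality reduction or composition.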
CITATION STYLE
Polajnar, T., & Clark, S. (2014). Improving distributional semantic vectors through context selection and normalisation. In 14th Conference of the European Chapter of the Association for Computational Linguistics 2014, EACL 2014 (pp. 230–238). Association for Computational Linguistics (ACL). https://doi.org/10.3115/v1/e14-1025