An adaptive hierarchical compositional model for Phrase embedding

14 citations · 26 readers on Mendeley

Abstract

Phrase embedding aims to represent phrases in a vector space and is important for the performance of many NLP tasks. Existing models regard a phrase as either fully compositional or non-compositional, ignoring the hybrid compositionality that is widespread, especially in long phrases. This drawback prevents them from gaining a deeper insight into the semantic structure of long phrases and, as a consequence, weakens the accuracy of the embeddings. In this paper, we present a novel method for jointly learning compositionality and phrase embedding by adaptively weighting different compositions using an implicit hierarchical structure. Our model can adaptively adjust among different compositions without incurring much additional model complexity or time cost. To the best of our knowledge, our work is the first to consider hybrid compositionality in phrase embedding. The experimental evaluation demonstrates that our model outperforms state-of-the-art methods on both similarity and analogy tasks.
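The core idea the abstract describes, blending a compositional representation (built from word vectors) with a non-compositional one (a dedicated phrase vector) via an adaptive weight, can be illustrated with a minimal sketch. This is not the paper's actual model: the gating function, dimensionality, and random toy vectors below are all assumptions for illustration; in the paper the weights and embeddings would be learned jointly.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 8  # toy embedding dimension (assumption)

# Toy vectors: word vectors feed the compositional path, and a dedicated
# vector for the whole phrase serves as the non-compositional path.
word_vecs = {w: rng.normal(size=DIM) for w in ["heavy", "metal"]}
phrase_vec = rng.normal(size=DIM)  # non-compositional vector for "heavy metal"

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def phrase_embedding(words, phrase_vec, gate_params):
    """Blend compositional and non-compositional embeddings.

    A scalar gate alpha in (0, 1) adaptively weights the two paths.
    Here the gate is a logistic function of the averaged word vectors;
    the paper's actual gating mechanism is hierarchical and learned.
    """
    comp = np.mean([word_vecs[w] for w in words], axis=0)  # compositional path
    alpha = sigmoid(gate_params @ comp)                    # adaptive weight
    return alpha * comp + (1.0 - alpha) * phrase_vec

gate_params = rng.normal(size=DIM)  # hypothetical gate parameters
emb = phrase_embedding(["heavy", "metal"], phrase_vec, gate_params)
print(emb.shape)
```

A fully compositional model corresponds to forcing alpha = 1 and a fully non-compositional one to alpha = 0; the adaptive gate lets each phrase fall anywhere in between.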

Citation (APA)

Li, B., Yang, X., Wang, B., Wang, W., Cui, W., & Zhang, X. (2018). An adaptive hierarchical compositional model for Phrase embedding. In IJCAI International Joint Conference on Artificial Intelligence (Vol. 2018-July, pp. 4144–4151). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2018/576
