Phrase embedding aims to represent phrases in a vector space and is important for the performance of many NLP tasks. Existing models regard a phrase as either fully compositional or non-compositional, ignoring the hybrid compositionality that widely exists, especially in long phrases. This drawback prevents them from gaining deeper insight into the semantic structure of long phrases and, as a consequence, weakens the accuracy of the embeddings. In this paper, we present a novel method for jointly learning compositionality and phrase embedding by adaptively weighting different compositions using an implicit hierarchical structure. Our model adaptively adjusts among different compositions without entailing too much model complexity or time cost. To the best of our knowledge, ours is the first work to consider hybrid compositionality in phrase embedding. Experimental evaluation demonstrates that our model outperforms state-of-the-art methods on both similarity and analogy tasks.
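The core idea of adaptively weighting compositions can be illustrated with a minimal toy sketch (not the paper's actual model): a phrase embedding formed as a gated mixture of a compositional vector (here, the mean of the word vectors) and a non-compositional phrase vector, where the gate weight is a hypothetical learned parameter.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 8  # toy embedding dimension

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical toy parameters (in the paper these would be learned jointly):
word_vecs = rng.normal(size=(3, dim))   # word embeddings of a 3-word phrase
phrase_vec = rng.normal(size=dim)       # non-compositional phrase embedding
gate_w = rng.normal(size=2 * dim)       # parameters of a scalar gate

# Compositional reading: simple average of the word vectors.
compositional = word_vecs.mean(axis=0)

# Gate decides, per phrase, how compositional the reading should be:
# alpha near 1 favors the compositional vector, alpha near 0 the
# non-compositional (idiomatic) vector.
alpha = sigmoid(gate_w @ np.concatenate([compositional, phrase_vec]))
embedding = alpha * compositional + (1.0 - alpha) * phrase_vec
```

This illustrates only the weighting mechanism; the paper additionally exploits an implicit hierarchical structure over sub-phrases rather than a single scalar gate.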
CITATION STYLE
Li, B., Yang, X., Wang, B., Wang, W., Cui, W., & Zhang, X. (2018). An adaptive hierarchical compositional model for phrase embedding. In IJCAI International Joint Conference on Artificial Intelligence (Vol. 2018-July, pp. 4144–4151). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2018/576