Learning word importance with the neural bag-of-words model

17 citations · 111 Mendeley readers

Abstract

The Neural Bag-of-Words (NBOW) model performs classification using an average of the input word vectors and achieves impressive performance. While the NBOW model learns word vectors targeted to the classification task, it does not explicitly model which words are important for a given task. In this paper we propose an improved NBOW model with the ability to learn task-specific word importance weights. The word importance weights are learned by introducing a new weighted-sum composition of the word vectors. Through experiments on standard topic and sentiment classification tasks, we show that (a) our proposed model learns meaningful word importance for a given task and (b) our model gives the best accuracies among the BOW approaches. We also show that the learned word importance weights are comparable to tf-idf based word weights when used as features in a BOW-SVM classifier.
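The weighted-sum composition described in the abstract can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation: the parameter names, the sigmoid squashing of importance scores, and the normalisation step are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

vocab_size, dim = 5, 4
E = rng.normal(size=(vocab_size, dim))  # word vectors (learned during training)
a = rng.normal(size=vocab_size)         # per-word importance scores (learned)

def nbow_average(word_ids):
    """Standard NBOW composition: plain average of the input word vectors."""
    return E[word_ids].mean(axis=0)

def nbow_weighted(word_ids):
    """Improved NBOW composition: importance-weighted sum of word vectors.

    The sigmoid squashing and normalisation here are illustrative choices,
    not necessarily the exact formulation in the paper.
    """
    w = 1.0 / (1.0 + np.exp(-a[word_ids]))  # squash scores to (0, 1)
    w = w / w.sum()                         # normalise so weights sum to 1
    return (w[:, None] * E[word_ids]).sum(axis=0)

doc = [0, 2, 2, 4]  # a toy "document" as a list of word ids
print(nbow_average(doc).shape, nbow_weighted(doc).shape)
```

In both cases the document is composed into a single fixed-size vector that can be fed to a classifier; the weighted variant simply lets frequent-but-uninformative words receive small weights.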

Citation (APA)

Sheikh, I., Illina, I., Fohr, D., & Linares, G. (2016). Learning word importance with the neural bag-of-words model. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (pp. 222–229). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/w16-1626
