Local contexts are effective for neural aspect extraction

Citations: 6 · Mendeley readers: 13
Abstract

Recently, long short-term memory based recurrent neural networks (LSTM-RNNs), which can capture long-range dependencies over a sequence, have obtained state-of-the-art performance on aspect extraction. In this work, we investigate how much can be achieved by taking only local dependencies into account. To this end, we develop a simple feed-forward neural network that takes as input a window of context words surrounding the word to be processed. Surprisingly, we find that a purely window-based neural network obtains performance comparable to an LSTM-RNN approach, which reveals the importance of local contexts for aspect extraction. Furthermore, we introduce a simple and natural way to leverage local and global contexts together, which is not only computationally cheaper than the existing LSTM-RNN approach but also achieves higher classification accuracy.
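The window-based model described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the hyperparameters (window size, embedding and hidden dimensions, tag set) and the padding scheme are assumed for the example, and training is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical hyperparameters (not specified in the abstract).
VOCAB, EMB, WIN, HID, TAGS = 100, 8, 2, 16, 3  # WIN words on each side; e.g. BIO tags
PAD = 0  # padding token id for positions outside the sentence

E = rng.normal(scale=0.1, size=(VOCAB, EMB))            # word embedding table
W1 = rng.normal(scale=0.1, size=((2 * WIN + 1) * EMB, HID))  # input -> hidden
b1 = np.zeros(HID)
W2 = rng.normal(scale=0.1, size=(HID, TAGS))            # hidden -> tag scores
b2 = np.zeros(TAGS)

def window_tag_probs(sentence, i):
    """Predict a tag distribution for token i from its local context window only."""
    idx = [sentence[j] if 0 <= j < len(sentence) else PAD
           for j in range(i - WIN, i + WIN + 1)]
    x = E[idx].reshape(-1)            # concatenate the window's embeddings
    h = np.tanh(x @ W1 + b1)          # single feed-forward hidden layer
    z = h @ W2 + b2
    z -= z.max()                      # numerically stable softmax
    p = np.exp(z)
    return p / p.sum()

sent = [5, 17, 42, 8, 99]             # a toy sentence as word ids
probs = window_tag_probs(sent, 2)     # tag distribution for the middle word
print(probs.shape)
```

Because each prediction reads only a fixed-size window, the forward pass costs a constant amount per token and tokens can be scored independently, which is the source of the computational advantage over a sequential LSTM-RNN.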

Citation (APA)
Yuan, J., Zhao, Y., Qin, B., & Liu, T. (2017). Local contexts are effective for neural aspect extraction. In Communications in Computer and Information Science (Vol. 774, pp. 244–255). Springer Verlag. https://doi.org/10.1007/978-981-10-6805-8_20
