Recurrent Neural CRF for Aspect Term Extraction with Dependency Transmission

Abstract

This paper presents a novel neural architecture for aspect term extraction in the area of fine-grained sentiment computing. In addition to combining sequential features (character embeddings, word embeddings, and POS-tagging information), we train an end-to-end recurrent neural network (RNN) with carefully designed dependency transmission between recurrent units, making it possible to learn structural syntactic phenomena. Experimental results show that incorporating these shallow semantic features improves aspect term extraction performance over a system that uses no linguistic information, demonstrating the utility of morphological information and syntactic structure for capturing the affinity between aspect words and their contexts.
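The abstract describes three ingredients: concatenated per-token features (character, word, and POS embeddings), a recurrent encoder whose units also receive the hidden state of each token's syntactic head (the "dependency transmission"), and a CRF-style decoder over aspect tags. A minimal numpy sketch of these ideas follows; all dimensions, parameter names, and the simple tanh/Viterbi formulation are illustrative assumptions, not the paper's exact gated architecture or trained CRF.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative (assumed) dimensions for the three feature channels.
D_CHAR, D_WORD, D_POS, D_HID = 4, 8, 3, 6
D_IN = D_CHAR + D_WORD + D_POS

# Recurrent unit augmented with a dependency-transmission term:
# besides the previous hidden state h_{t-1}, each step receives
# the hidden state of the token's syntactic head, h_head(t).
W_x = rng.standard_normal((D_HID, D_IN)) * 0.1
W_h = rng.standard_normal((D_HID, D_HID)) * 0.1
W_dep = rng.standard_normal((D_HID, D_HID)) * 0.1  # dependency transmission
b = np.zeros(D_HID)

def encode(x_seq, heads):
    """Run the recurrent encoder over one sentence.

    x_seq: (T, D_IN) concatenated char/word/POS features per token.
    heads: syntactic head index per token; the dependency term is
           zeroed for the root or for heads not yet encoded (t' >= t).
    """
    T = x_seq.shape[0]
    H = np.zeros((T, D_HID))
    h_prev = np.zeros(D_HID)
    for t in range(T):
        h_head = H[heads[t]] if 0 <= heads[t] < t else np.zeros(D_HID)
        h_prev = np.tanh(W_x @ x_seq[t] + W_h @ h_prev + W_dep @ h_head + b)
        H[t] = h_prev
    return H

# CRF-style decoding: emission scores over BIO aspect tags plus a
# transition matrix, decoded with Viterbi (untrained random weights).
TAGS = ["O", "B-ASP", "I-ASP"]
W_out = rng.standard_normal((len(TAGS), D_HID)) * 0.1
trans = rng.standard_normal((len(TAGS), len(TAGS))) * 0.1

def viterbi(emissions, trans):
    """Return the highest-scoring tag path under a linear-chain model."""
    T, K = emissions.shape
    score = emissions[0].copy()
    back = np.zeros((T, K), dtype=int)
    for t in range(1, T):
        cand = score[:, None] + trans + emissions[t][None, :]
        back[t] = cand.argmax(axis=0)
        score = cand.max(axis=0)
    path = [int(score.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t][path[-1]]))
    return path[::-1]

# Toy 5-token sentence: random features and a hand-picked head index
# per token (-1 marks the root).
T = 5
x_seq = np.concatenate(
    [rng.standard_normal((T, D_CHAR)),   # character-level embedding
     rng.standard_normal((T, D_WORD)),   # word embedding
     rng.standard_normal((T, D_POS))],   # POS-tag embedding
    axis=1)
heads = [-1, 0, 1, 1, 3]
H = encode(x_seq, heads)
tags = viterbi(H @ W_out.T, trans)
```

In the paper the encoder, emission, and transition parameters would be trained end-to-end; this sketch only shows how head-indexed hidden states can be fed back into the recurrence alongside the sequential features.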

Cite

APA
Guo, L., Jiang, S., Du, W., & Gan, S. (2018). Recurrent Neural CRF for Aspect Term Extraction with Dependency Transmission. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11108 LNAI, pp. 378–390). Springer Verlag. https://doi.org/10.1007/978-3-319-99495-6_32
