Recurrent neural word segmentation with tag inference

Abstract

In this paper, we present a Long Short-Term Memory (LSTM) based model for the task of Chinese Weibo word segmentation. The model adopts an LSTM layer to capture long-range dependencies within a sentence and learn the underlying patterns. To infer the optimal tag path, we introduce a transition score matrix that scores jumps between the tags of successive characters. Integrating several unsupervised features further improves performance. Our model achieves weighted F1-scores of 0.8044 on the closed track and 0.8298 on the semi-open track.
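To illustrate the tag-inference step described above, the sketch below shows how an optimal tag path can be decoded with Viterbi-style dynamic programming over per-character tag scores and a transition score matrix. This is not the authors' code; the emission scores are random placeholders standing in for the LSTM layer's outputs, and the 4-tag BMES scheme is assumed.

```python
# Minimal sketch (assumptions: BMES tagging, random stand-in scores).
import numpy as np

TAGS = ["B", "M", "E", "S"]  # common 4-tag scheme for Chinese word segmentation

def viterbi(emissions: np.ndarray, transitions: np.ndarray) -> list:
    """Return the highest-scoring tag sequence.

    emissions:   (seq_len, n_tags) per-character tag scores (e.g. LSTM outputs)
    transitions: (n_tags, n_tags)  score of jumping from tag i to tag j
    """
    seq_len, n_tags = emissions.shape
    score = emissions[0].copy()                      # best score ending in each tag
    backptr = np.zeros((seq_len, n_tags), dtype=int)

    for t in range(1, seq_len):
        # candidate[i, j] = best path ending in tag i at t-1, then jumping to tag j at t
        candidate = score[:, None] + transitions + emissions[t][None, :]
        backptr[t] = candidate.argmax(axis=0)
        score = candidate.max(axis=0)

    # Follow back-pointers from the best final tag.
    best = [int(score.argmax())]
    for t in range(seq_len - 1, 0, -1):
        best.append(int(backptr[t, best[-1]]))
    return best[::-1]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    sent_len = 6
    emissions = rng.normal(size=(sent_len, len(TAGS)))    # placeholder for LSTM scores
    transitions = rng.normal(size=(len(TAGS), len(TAGS))) # learned transition scores
    path = viterbi(emissions, transitions)
    print([TAGS[i] for i in path])
```

In practice, the transition matrix would be learned jointly with the LSTM so that invalid jumps (e.g. B followed by S) receive low scores.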

Citation

Zhou, Q., Ma, L., Zheng, Z., Wang, Y., & Wang, X. (2016). Recurrent neural word segmentation with tag inference. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 10102, pp. 734–743). Springer Verlag. https://doi.org/10.1007/978-3-319-50496-4_66
