Languages using Chinese characters are mostly processed at the word level. Inspired by the recent success of deep learning, we delve deeper, to the character and radical levels, for Chinese language processing. We propose a new deep learning technique, called "radical embedding", with justifications based on Chinese linguistics, and validate its feasibility and utility through a set of three experiments: two in-house standard experiments on short-text categorization (STC) and Chinese word segmentation (CWS), and one in-field experiment on search ranking. We show that radical embedding achieves results comparable to, and sometimes better than, those of competing methods.
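The core idea can be sketched in a few lines: decompose each character into its component radicals, give each radical its own embedding vector, and compose those vectors into a character representation. The snippet below is a minimal illustration only, with a hypothetical three-entry decomposition table, random vectors, and mean pooling for composition; the paper itself feeds radical sequences into a convolutional network rather than pooling them.

```python
import numpy as np

# Toy radical lookup mapping a character to its component radicals.
# (Hypothetical, for illustration; a real system would use a full
# decomposition table such as the Unihan database.)
RADICALS = {
    "好": ["女", "子"],        # "good"   = woman + child
    "妈": ["女", "马"],        # "mother" = woman + horse
    "骂": ["口", "口", "马"],  # "scold"  = mouth + mouth + horse
}

DIM = 8
rng = np.random.default_rng(0)

# One embedding vector per distinct radical (randomly initialized here;
# in training these would be learned parameters).
radical_vocab = sorted({r for rads in RADICALS.values() for r in rads})
radical_vecs = {r: rng.normal(size=DIM) for r in radical_vocab}

def char_vector(ch):
    """Represent a character as the mean of its radical embeddings."""
    return np.mean([radical_vecs[r] for r in RADICALS[ch]], axis=0)

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Characters that share radicals (e.g. 妈 and 骂 share 马) share
# components of their vectors, which word-level models cannot exploit.
print(cosine(char_vector("妈"), char_vector("骂")))
```

Because radicals are shared across many characters, this sub-character vocabulary is far smaller than a word vocabulary, which is one motivation the paper gives for going below the word level.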
CITATION STYLE
Shi, X., Zhai, J., Yang, X., Xie, Z., & Liu, C. (2015). Radical embedding: Delving deeper to Chinese radicals. In ACL-IJCNLP 2015 - 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing of the Asian Federation of Natural Language Processing, Proceedings of the Conference (Vol. 2, pp. 594–598). Association for Computational Linguistics (ACL). https://doi.org/10.3115/v1/p15-2098