This article offers an empirical exploration of the efficient use of word-level convolutional neural networks (word-CNNs) for large-scale text classification. Word-CNNs are generally difficult to train on large-scale datasets because the size of the word embedding grows dramatically with the size of the vocabulary. To address this issue, this paper presents a de-noising approach to word embedding. We compare our model with several recently proposed CNN models on publicly available datasets. The experimental results show that the proposed method improves the usefulness of word-CNNs and increases the accuracy of text classification.
CITATION STYLE
Ma, X., Jin, R., Paik, J. Y., & Chung, T. S. (2018). Large scale text classification with efficient word embedding. In Lecture Notes in Electrical Engineering (Vol. 425, pp. 465–469). Springer Verlag. https://doi.org/10.1007/978-981-10-5281-1_51