As the amount of unstructured text data produced by humanity grows rapidly on the Internet, intelligent techniques are required to process it and extract different types of knowledge from it. Gated recurrent units (GRU) and support vector machines (SVM) have been successfully applied in natural language processing (NLP) systems with comparable, remarkable results. GRU networks perform well in sequential learning tasks and overcome the vanishing- and exploding-gradient problems of standard recurrent neural networks (RNNs) when capturing long-term dependencies. In this paper, we propose a text classification model that improves on this norm by using a linear support vector machine (SVM) in place of Softmax in the final output layer of a GRU model. Furthermore, the cross-entropy function is replaced with a margin-based function. Empirical results show that the proposed GRU-SVM model achieves comparatively better results than the baseline approaches BLSTM-C and DABN.
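The core idea of the abstract, a GRU encoder whose final layer is a linear SVM trained with a margin-based loss instead of Softmax with cross-entropy, can be sketched as follows. This is a minimal illustrative sketch in NumPy, not the authors' implementation: the dimensions, the single-step GRU, and the Crammer-Singer-style multiclass hinge loss are all assumptions made here for clarity.

```python
import numpy as np

def gru_step(x, h, Wz, Uz, Wr, Ur, Wh, Uh):
    """One GRU step: update gate z, reset gate r, candidate state h_tilde."""
    sigmoid = lambda a: 1.0 / (1.0 + np.exp(-a))
    z = sigmoid(Wz @ x + Uz @ h)               # update gate
    r = sigmoid(Wr @ x + Ur @ h)               # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h))   # candidate hidden state
    return (1 - z) * h + z * h_tilde

def multiclass_hinge_loss(scores, y, margin=1.0):
    """Margin-based objective replacing cross-entropy (Crammer-Singer style)."""
    margins = np.maximum(0.0, scores - scores[y] + margin)
    margins[y] = 0.0  # the true class contributes no loss against itself
    return margins.sum()

rng = np.random.default_rng(0)
d_in, d_h, n_classes = 4, 3, 2  # toy dimensions, chosen for illustration
params = [rng.normal(scale=0.1, size=s)
          for s in [(d_h, d_in), (d_h, d_h)] * 3]  # Wz,Uz, Wr,Ur, Wh,Uh

h = np.zeros(d_h)
for x in rng.normal(size=(5, d_in)):  # encode a toy sequence of 5 tokens
    h = gru_step(x, h, *params)

W_svm = rng.normal(scale=0.1, size=(n_classes, d_h))  # linear SVM layer
scores = W_svm @ h                    # raw class margins, no Softmax
loss = multiclass_hinge_loss(scores, y=0)
print(scores.shape, loss)
```

In training, the gradient of this hinge loss would be backpropagated through the GRU in place of the usual cross-entropy gradient; at inference time, classification is simply the arg-max over the raw SVM scores.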
Citation
Zulqarnain, M., Ghazali, R., Hassim, Y. M. M., & Rehan, M. (2020). Text classification based on gated recurrent unit combines with support vector machine. International Journal of Electrical and Computer Engineering, 10(4), 3734–3742. https://doi.org/10.11591/ijece.v10i4.pp3734-3742