Text classification based on gated recurrent unit combines with support vector machine


Abstract

As humanity produces ever larger amounts of unstructured text data and the volume of text on the Internet continues to grow, intelligent techniques are required to process it and extract different types of knowledge from it. Gated recurrent units (GRUs) and support vector machines (SVMs) have both been successfully applied to natural language processing (NLP) systems with comparably remarkable results. GRU networks perform well on sequential learning tasks and overcome the vanishing- and exploding-gradient problems that afflict standard recurrent neural networks (RNNs) when capturing long-term dependencies. In this paper, we propose a text classification model that improves on the standard approach by replacing the Softmax function in the final output layer of a GRU model with a linear support vector machine (SVM). Furthermore, the cross-entropy loss is replaced with a margin-based loss function. Empirical results show that the proposed GRU-SVM model achieves comparatively better results than the baseline approaches BLSTM-C and DABN.
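The core change the abstract describes, swapping the Softmax output layer and cross-entropy loss for a linear SVM layer trained with a margin-based loss, can be illustrated in isolation. The sketch below is a minimal NumPy illustration, not the authors' implementation: it assumes the GRU has already encoded each input into a final hidden state `h`, and the names `h`, `W`, `b`, and the batch/hidden sizes are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: a GRU encoder (not shown) has produced final hidden
# states h for a batch of texts. The linear layer (W, b) plays the role of
# the SVM's decision function over the classes.
batch, hidden, n_classes = 4, 8, 3
h = rng.standard_normal((batch, hidden))        # final GRU hidden states
W = rng.standard_normal((hidden, n_classes)) * 0.1
b = np.zeros(n_classes)
y = np.array([0, 2, 1, 0])                      # gold class labels

scores = h @ W + b                              # linear SVM scores, shape (batch, n_classes)

def cross_entropy(scores, y):
    """Baseline: softmax + cross-entropy, computed in a numerically stable way."""
    z = scores - scores.max(axis=1, keepdims=True)
    log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(y)), y].mean()

def multiclass_hinge(scores, y, margin=1.0):
    """Margin-based replacement: penalise any class whose score comes
    within `margin` of the gold class score (Crammer-Singer style)."""
    correct = scores[np.arange(len(y)), y][:, None]
    margins = np.maximum(0.0, scores - correct + margin)
    margins[np.arange(len(y)), y] = 0.0         # the gold class itself is not penalised
    return margins.sum(axis=1).mean()

print("cross-entropy:", cross_entropy(scores, y))
print("hinge:        ", multiclass_hinge(scores, y))
```

The key behavioural difference: once the gold class outscores every other class by the margin, the hinge loss is exactly zero and stops pushing, whereas cross-entropy keeps producing a small gradient however confident the model is.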

Citation (APA)

Zulqarnain, M., Ghazali, R., Hassim, Y. M. M., & Rehan, M. (2020). Text classification based on gated recurrent unit combines with support vector machine. International Journal of Electrical and Computer Engineering, 10(4), 3734–3742. https://doi.org/10.11591/ijece.v10i4.pp3734-3742
