Bidirectional recurrent neural network approach for Arabic named entity recognition


Abstract

Recurrent neural networks (RNNs) have achieved remarkable success in sequence labeling tasks that require memory of earlier context. Because an RNN can retain information from previous elements of a sequence, it is well suited to natural language processing (NLP) tasks. Named entity recognition (NER) is a common NLP task that can be framed as a per-token classification problem. We propose a bidirectional long short-term memory (LSTM) model for entity recognition in Arabic text. The LSTM network processes a sequence while relating each element to its surrounding context, which makes it useful for NER. Moreover, we use pre-trained word embeddings to represent the inputs fed into the LSTM network. The proposed model is evaluated on the popular "ANERcorp" dataset. Experimental results show that the model with word embeddings achieves an F-score of approximately 88.01%.
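For illustration only, the sketch below shows one plausible way to implement the kind of bidirectional LSTM tagger with pre-trained word embeddings that the abstract describes. The choice of PyTorch, the layer sizes, the vocabulary size, and the nine-label tag set are assumptions for the example, not details taken from the paper.

# Minimal sketch (not the authors' code): a BiLSTM tagger with an embedding
# layer that can be initialized from pre-trained word vectors.
import torch
import torch.nn as nn

class BiLSTMTagger(nn.Module):
    def __init__(self, vocab_size, embed_dim, hidden_dim, num_tags,
                 pretrained_embeddings=None):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        if pretrained_embeddings is not None:
            # Initialize with pre-trained word vectors (e.g. word2vec/fastText).
            self.embedding.weight.data.copy_(pretrained_embeddings)
        # The bidirectional LSTM reads each sentence left-to-right and
        # right-to-left, so every token sees both preceding and following context.
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                            bidirectional=True)
        # Per-token classification over the entity tag set.
        self.classifier = nn.Linear(2 * hidden_dim, num_tags)

    def forward(self, token_ids):
        embedded = self.embedding(token_ids)   # (batch, seq_len, embed_dim)
        outputs, _ = self.lstm(embedded)       # (batch, seq_len, 2 * hidden_dim)
        return self.classifier(outputs)        # (batch, seq_len, num_tags)

# Example usage with illustrative dimensions.
model = BiLSTMTagger(vocab_size=10000, embed_dim=100, hidden_dim=128, num_tags=9)
tokens = torch.randint(1, 10000, (2, 15))      # two sentences of 15 token ids
tag_scores = model(tokens)                     # shape: (2, 15, 9)

In practice the per-token scores would be trained with a cross-entropy loss against the gold entity tags; the exact training setup and hyperparameters here are assumptions.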

Citation (APA)
Ali, M. N. A., Tan, G., & Hussain, A. (2018). Bidirectional recurrent neural network approach for Arabic named entity recognition. Future Internet, 10(12). https://doi.org/10.3390/fi10120123
