Deep Learning Transformer Architecture for Named-Entity Recognition on Low-Resourced Languages: State of the art results

Abstract

This paper reports on the evaluation of Deep Learning (DL) transformer architecture models for Named-Entity Recognition (NER) on ten low-resourced South African (SA) languages. In addition, these DL transformer models were compared to other Neural Network and Machine Learning (ML) NER models. The findings show that transformer models substantially improve performance when discrete fine-tuning parameters are applied per language. Furthermore, fine-tuned transformer models outperform other neural network and machine learning models on NER for the low-resourced SA languages. For example, the transformer models obtained the highest F-scores for six of the ten SA languages and the highest average F-score, surpassing the Conditional Random Fields (CRF) ML model. Practical implications include developing high-performance NER capability with less effort and lower resource costs, potentially improving downstream NLP tasks such as Machine Translation (MT). Therefore, the application of DL transformer architecture models to NLP NER sequence-tagging tasks on low-resourced SA languages is viable. Additional research could evaluate more recent transformer architecture models on other Natural Language Processing tasks and applications, such as phrase chunking, MT, and Part-of-Speech tagging.
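The abstract frames NER as a sequence-tagging task solved with a transformer encoder and a per-token classification head. The paper's actual fine-tuned models are not reproduced here; the sketch below is a minimal, hypothetical PyTorch illustration of that architecture shape (all dimensions, the vocabulary size, and the tag count are assumed placeholders, not values from the paper):

```python
import torch
import torch.nn as nn

class TransformerNERTagger(nn.Module):
    """Minimal transformer encoder with a per-token classification head,
    illustrating NER as sequence tagging. All sizes are illustrative."""
    def __init__(self, vocab_size=1000, d_model=64, nhead=4,
                 num_layers=2, num_tags=9):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        # one score per entity tag (e.g. BIO labels) for every token
        self.head = nn.Linear(d_model, num_tags)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) -> logits: (batch, seq_len, num_tags)
        return self.head(self.encoder(self.embed(token_ids)))

model = TransformerNERTagger()
batch = torch.randint(0, 1000, (2, 12))  # 2 sentences of 12 token ids each
logits = model(batch)
print(logits.shape)  # torch.Size([2, 12, 9])
```

In practice, fine-tuning a pretrained multilingual model (rather than training this toy encoder from scratch) is what the paper evaluates; the per-token logits would be decoded into entity labels and scored with the F-score, as in the comparison against the CRF baseline.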

Citation (APA)
Hanslo, R. (2022). Deep Learning Transformer Architecture for Named-Entity Recognition on Low-Resourced Languages: State of the art results. In Proceedings of the 17th Conference on Computer Science and Intelligence Systems, FedCSIS 2022 (pp. 53–60). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.15439/2022F53
