Large Language Models for Latvian Named Entity Recognition


Abstract

Transformer-based language models pre-trained on large corpora have demonstrated strong results on multiple natural language processing tasks for widely used languages, including named entity recognition (NER). In this paper, we investigate the role of BERT models in the NER task for Latvian. We introduce a BERT model pre-trained on large Latvian corpora and demonstrate that it achieves better results than multilingual BERT (M-BERT): an average F1-measure of 81.91 vs. 78.37 on a dataset with nine named entity types, and 79.72 vs. 78.83 on another dataset with seven types. The Latvian model also outperforms previously developed Latvian NER systems.
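The setup described here can be illustrated with a minimal sketch of preparing a pre-trained BERT checkpoint for token classification (NER) with the Hugging Face transformers library. This is an assumed setup, not the authors' code: the Latvian BERT checkpoint name is not given in the abstract, so the public M-BERT checkpoint (bert-base-multilingual-cased) stands in, and the label set and example sentence are hypothetical.

```python
# A minimal sketch (not the authors' code) of setting up a BERT model for NER
# with Hugging Face transformers. "bert-base-multilingual-cased" is the public
# M-BERT baseline; the Latvian checkpoint name is not stated in the abstract.
from transformers import AutoModelForTokenClassification, AutoTokenizer, pipeline

# Hypothetical BIO label set; the paper's datasets use nine and seven entity types.
labels = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG", "B-LOC", "I-LOC"]

model_name = "bert-base-multilingual-cased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForTokenClassification.from_pretrained(
    model_name,
    num_labels=len(labels),
    id2label=dict(enumerate(labels)),
    label2id={label: i for i, label in enumerate(labels)},
)

# After fine-tuning on labelled Latvian data, inference can be run as a
# token-classification pipeline (an untrained head gives random labels).
ner = pipeline("ner", model=model, tokenizer=tokenizer, aggregation_strategy="simple")
print(ner("Rīga ir Latvijas galvaspilsēta."))  # "Riga is the capital of Latvia."
```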

Citation (APA)

Vīksna, R., & Skadiņa, I. (2020). Large language models for Latvian named entity recognition. In Frontiers in Artificial Intelligence and Applications (Vol. 328, pp. 62–69). IOS Press BV. https://doi.org/10.3233/faia200603
