MELM: Data Augmentation with Masked Entity Language Modeling for Low-Resource NER

Citations: 70
Mendeley readers: 87

Abstract

Data augmentation is an effective solution to data scarcity in low-resource scenarios. However, when applied to token-level tasks such as NER, data augmentation methods often suffer from token-label misalignment, which leads to unsatisfactory performance. In this work, we propose Masked Entity Language Modeling (MELM) as a novel data augmentation framework for low-resource NER. To alleviate the token-label misalignment issue, we explicitly inject NER labels into sentence context, and thus the fine-tuned MELM is able to predict masked entity tokens by explicitly conditioning on their labels. Thereby, MELM generates high-quality augmented data with novel entities, which provides rich entity regularity knowledge and boosts NER performance. When training data from multiple languages are available, we also integrate MELM with code-mixing for further improvement. We demonstrate the effectiveness of MELM on monolingual, cross-lingual and multilingual NER across various low-resource levels. Experimental results show that our MELM presents substantial improvement over the baseline methods.
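The label-injection idea described in the abstract can be illustrated with a minimal preprocessing sketch. This is not the authors' released code; the function name, marker format, and masking probability below are assumptions for illustration. The core step is wrapping each entity token in markers derived from its NER label, then masking some entity tokens, so that a fine-tuned masked language model sees the label context when predicting replacements:

```python
import random

def inject_labels_and_mask(tokens, labels, mask_token="<mask>", mask_prob=0.7):
    """Sketch of MELM-style preprocessing (names/format are assumptions):
    wrap each entity token in label markers, then randomly replace entity
    tokens with the mask token. Non-entity ("O") tokens are left untouched,
    which avoids the token-label misalignment that arises when augmentation
    rewrites context words."""
    out = []
    for tok, lab in zip(tokens, labels):
        if lab == "O":
            out.append(tok)  # context tokens are never masked
        else:
            chosen = mask_token if random.random() < mask_prob else tok
            out.extend([f"<{lab}>", chosen, f"</{lab}>"])  # label-conditioned slot
    return " ".join(out)

# Example: the MLM must fill each <mask> with an entity of the marked type.
print(inject_labels_and_mask(
    ["John", "lives", "in", "Paris"],
    ["B-PER", "O", "O", "B-LOC"],
    mask_prob=1.0,
))
```

At augmentation time, a fill-mask model fine-tuned on such label-injected sentences would propose novel entities of the correct type for each masked slot, and the markers make the gold label for every generated token unambiguous.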




Citation (APA)

Zhou, R., Li, X., He, R., Bing, L., Cambria, E., Si, L., & Miao, C. (2022). MELM: Data Augmentation with Masked Entity Language Modeling for Low-Resource NER. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (Vol. 1, pp. 2251–2262). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2022.acl-long.160

Readers' Seniority

PhD / Post grad / Masters / Doc: 21 (75%)
Researcher: 5 (18%)
Professor / Associate Prof.: 1 (4%)
Lecturer / Post doc: 1 (4%)

Readers' Discipline

Computer Science: 30 (86%)
Linguistics: 3 (9%)
Neuroscience: 1 (3%)
Engineering: 1 (3%)
