Put it back: Entity typing with language model enhancement


Abstract

Entity typing aims to classify the semantic types of an entity mention in a specific context. Most existing models obtain training data via distant supervision and therefore inevitably suffer from noisy labels. To address this issue, we propose entity typing with language model enhancement. It utilizes a language model to measure the compatibility between context sentences and labels, and thereby automatically focuses more on context-dependent labels. Experiments on benchmark datasets demonstrate that our method is capable of enhancing the entity typing model with information from the language model, and significantly outperforms the state-of-the-art baseline. Code and data for this paper can be found at https://github.com/thunlp/LME.
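The core idea described in the abstract — substituting candidate type labels back into the context sentence and letting a language model score how well each label fits — can be illustrated with a minimal sketch. The toy bigram LM, the tiny corpus, and the function names (`lm_score`, `label_compatibility`) below are illustrative assumptions, not the paper's actual implementation, which uses a trained neural language model; the sketch only shows the "put it back and score" mechanism.

```python
import math
from collections import Counter
from itertools import tee

# Toy corpus standing in for LM training data (illustrative only).
corpus = (
    "the musician played guitar on stage . "
    "the politician gave a speech in parliament . "
    "the musician released an album ."
).split()

def bigrams(tokens):
    a, b = tee(tokens)
    next(b, None)
    return list(zip(a, b))

bigram_counts = Counter(bigrams(corpus))
unigram_counts = Counter(corpus)

def lm_score(tokens):
    """Log-probability-like score of a sentence under the toy bigram LM
    (add-one smoothing over the tiny vocabulary)."""
    vocab = len(unigram_counts)
    score = 0.0
    for prev, cur in bigrams(tokens):
        score += math.log(
            (bigram_counts[(prev, cur)] + 1) / (unigram_counts[prev] + vocab)
        )
    return score

def label_compatibility(context, mention, labels):
    """'Put back' each candidate type label in place of the mention,
    score the resulting sentence with the LM, and softmax the scores
    into weights that favor context-dependent labels."""
    scores = []
    for label in labels:
        sentence = [label if tok == mention else tok for tok in context]
        scores.append(lm_score(sentence))
    peak = max(scores)
    exps = [math.exp(s - peak) for s in scores]
    total = sum(exps)
    return {lab: e / total for lab, e in zip(labels, exps)}

# A mention in a musical context: the LM should prefer "musician".
context = "the Dylan played guitar on stage .".split()
weights = label_compatibility(context, "Dylan", ["musician", "politician"])
```

In a full system, these compatibility weights would be used to down-weight distantly supervised labels that clash with the context during training, rather than treating all labels from the knowledge base as equally correct.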

Citation (APA)

Xin, J., Zhu, H., Han, X., Liu, Z., & Sun, M. (2018). Put it back: Entity typing with language model enhancement. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, EMNLP 2018 (pp. 993–998). Association for Computational Linguistics. https://doi.org/10.18653/v1/d18-1121
