Improving neural fine-grained entity typing with knowledge attention

Citations: 55
Readers: 134 (Mendeley users who have this article in their library)

Abstract

Fine-grained entity typing aims to identify the semantic type of an entity mentioned in plain text. It is an important task that benefits many natural language processing (NLP) applications. Most existing methods extract features separately from the entity mention and its context words for type classification, and therefore fail to model the complex correlations between entity mentions and context words. They also neglect the rich background information about these entities available in knowledge bases (KBs). To address these issues, we incorporate KB information to bridge entity mentions and their context, and propose Knowledge-Attention Neural Fine-Grained Entity Typing. Experimental results and case studies on real-world datasets demonstrate that our model significantly outperforms state-of-the-art methods, showing the effectiveness of incorporating KB information for entity typing. Code and data for this paper can be found at https://github.com/thunlp/KNET.
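
To make the idea concrete, below is a minimal sketch of a knowledge-attention step: a pretrained KB embedding of the candidate entity scores each context word, and the attention-weighted context is combined with the mention representation for type classification. All dimensions, parameter names (e.g. W_att, W_cls), and the bilinear scoring form are illustrative assumptions, not the authors' released implementation.

```python
# Illustrative knowledge attention over context words (assumed formulation).
import numpy as np

def softmax(x):
    x = x - x.max()
    e = np.exp(x)
    return e / e.sum()

def knowledge_attention(context_hidden, kb_entity_emb, W_att):
    """Attend over context words using a KB entity embedding as the query.

    context_hidden : (L, d)  hidden states of the L context words
    kb_entity_emb  : (k,)    pretrained KB embedding of the candidate entity
    W_att          : (k, d)  bilinear attention parameters (assumed form)
    """
    # Bilinear score between the KB embedding and each context hidden state.
    scores = context_hidden @ W_att.T @ kb_entity_emb   # (L,)
    alpha = softmax(scores)                              # attention weights
    # Context representation as the attention-weighted sum of hidden states.
    return alpha @ context_hidden                        # (d,)

# Toy usage with random values, just to show the shapes involved.
rng = np.random.default_rng(0)
L, d, k, n_types = 10, 100, 50, 74
context_hidden = rng.standard_normal((L, d))
kb_entity_emb = rng.standard_normal(k)
W_att = rng.standard_normal((k, d)) * 0.1

context_repr = knowledge_attention(context_hidden, kb_entity_emb, W_att)
mention_repr = rng.standard_normal(d)   # e.g. averaged mention word vectors

# Concatenate mention and knowledge-attended context, then score each type.
features = np.concatenate([mention_repr, context_repr])  # (2d,)
W_cls = rng.standard_normal((n_types, 2 * d)) * 0.1
type_scores = W_cls @ features
print(type_scores.shape)  # (74,), one score per candidate type
```

The key design point this sketch illustrates is that the KB embedding, rather than the mention surface form alone, determines which context words matter for typing, which is how KB information bridges the mention and its context.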

Citation (APA)
Xin, J., Lin, Y., Liu, Z., & Sun, M. (2018). Improving neural fine-grained entity typing with knowledge attention. In 32nd AAAI Conference on Artificial Intelligence, AAAI 2018 (pp. 5997–6004). AAAI Press. https://doi.org/10.1609/aaai.v32i1.12038
