Fine-grained entity type classification by jointly learning representations and label embeddings

73 citations · 142 Mendeley readers

Abstract

Fine-grained entity type classification (FETC) is the task of classifying an entity mention into a broad set of types. The distant supervision paradigm is extensively used to generate training data for this task. However, the generated training data assigns the same set of labels to every mention of an entity without considering its local context. Existing FETC systems have two major drawbacks: they assume the training data to be noise-free, and they rely on hand-crafted features. Our work overcomes both drawbacks. We propose a neural network model that jointly learns representations of entity mentions and their context, eliminating the use of hand-crafted features. Our model treats the training data as noisy and uses a non-parametric variant of the hinge loss function. Experiments show that the proposed model outperforms previous state-of-the-art methods on two publicly available datasets, namely FIGER(GOLD) and BBN, with an average relative improvement of 2.69% in micro-F1 score. Knowledge learnt by our model on one dataset can be transferred to other datasets, either with the same model or with other FETC systems, and this transfer further improves the performance of the respective models.
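The abstract does not spell out the loss function, so the following is only an illustrative sketch of one common hinge-style objective for multi-label type classification: each negative type is penalized if its score comes within a margin of the best-scoring positive type. The function name, the margin value, and the exact formulation are assumptions for illustration, not the paper's actual non-parametric variant.

```python
def multilabel_hinge_loss(scores, positive_idx, margin=1.0):
    """Hinge-style loss for multi-label type classification (illustrative).

    scores: list of real-valued scores, one per candidate type.
    positive_idx: indices of the types labeled positive for this mention.
    Penalizes every negative type whose score comes within `margin`
    of the highest-scoring positive type.
    """
    positive_idx = set(positive_idx)
    best_pos = max(scores[i] for i in positive_idx)
    loss = 0.0
    for i, s in enumerate(scores):
        if i not in positive_idx:
            # Margin violation: negative type scored too close to a positive.
            loss += max(0.0, margin + s - best_pos)
    return loss
```

For example, with scores `[2.0, 0.0, 1.5]` and type 0 positive, only type 2 violates the unit margin, contributing 0.5 to the loss.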

Citation (APA)

Abhishek, Anand, A., & Awekar, A. (2017). Fine-grained entity type classification by jointly learning representations and label embeddings. In 15th Conference of the European Chapter of the Association for Computational Linguistics, EACL 2017 - Proceedings of Conference (Vol. 2, pp. 797–807). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/e17-1075
