Lexicon infused phrase embeddings for named entity resolution

168 citations · 294 Mendeley readers

Abstract

Most state-of-the-art approaches to named-entity recognition (NER) use semi-supervised information in the form of word clusters and lexicons. Recently, neural network-based language models have been explored, since they generate, as a byproduct, highly informative vector representations of words, known as word embeddings. In this paper we present two contributions: a new method for learning word embeddings that can leverage information from relevant lexicons to improve the representations, and the first system to use neural word embeddings to achieve state-of-the-art results on named-entity recognition on both the CoNLL and OntoNotes NER tasks. Our system achieves an F1 score of 90.90 on the CoNLL 2003 test set, significantly better than any previous system trained on public data and matching a system employing massive private industrial query-log data.
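The core idea of lexicon-infused embeddings is to extend a skip-gram-style objective so that, besides predicting context words, the model also predicts whether words near the center word belong to a given lexicon. The toy sketch below illustrates that idea with plain NumPy; all function names, the corpus, and the single-lexicon logistic term are illustrative assumptions, not the paper's actual implementation (which uses the full lexicon-membership prediction machinery of Passos et al.).

```python
import numpy as np

def train_lexicon_skipgram(corpus, lexicon, dim=16, window=2,
                           lr=0.1, epochs=20, seed=0):
    """Toy skip-gram with negative sampling, augmented with a term that
    predicts lexicon membership of context words. Illustrative only."""
    rng = np.random.default_rng(seed)
    vocab = sorted({w for sent in corpus for w in sent})
    idx = {w: i for i, w in enumerate(vocab)}
    V = len(vocab)
    W_in = rng.normal(0.0, 0.1, (V, dim))   # center-word embeddings
    W_out = rng.normal(0.0, 0.1, (V, dim))  # context-word embeddings
    w_lex = rng.normal(0.0, 0.1, dim)       # lexicon-membership predictor

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    for _ in range(epochs):
        for sent in corpus:
            for i, center in enumerate(sent):
                c = idx[center]
                for j in range(max(0, i - window),
                               min(len(sent), i + window + 1)):
                    if j == i:
                        continue
                    o = idx[sent[j]]
                    v = W_in[c].copy()
                    # positive (center, context) pair
                    g = sigmoid(v @ W_out[o]) - 1.0
                    W_in[c] -= lr * g * W_out[o]
                    W_out[o] -= lr * g * v
                    # one random negative sample
                    n = rng.integers(V)
                    g = sigmoid(v @ W_out[n])
                    W_in[c] -= lr * g * W_out[n]
                    W_out[n] -= lr * g * v
                    # lexicon signal: is the context word in the lexicon?
                    y = 1.0 if sent[j] in lexicon else 0.0
                    g = sigmoid(v @ w_lex) - y
                    W_in[c] -= lr * g * w_lex
                    w_lex -= lr * g * v
    return vocab, W_in

# Example: person names sharing contexts, with a person-name lexicon.
corpus = [["president", "obama", "spoke"],
          ["president", "clinton", "spoke"],
          ["the", "dog", "barked"]]
vocab, emb = train_lexicon_skipgram(corpus, {"obama", "clinton"})
```

Because the lexicon term shares the center-word embeddings, words that tend to co-occur with lexicon members are pulled toward a common direction, which is one intuition behind why lexicon information can sharpen the representations NER systems consume.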

Cite

APA

Passos, A., Kumar, V., & McCallum, A. (2014). Lexicon infused phrase embeddings for named entity resolution. In CoNLL 2014 - 18th Conference on Computational Natural Language Learning, Proceedings (pp. 78–86). Association for Computational Linguistics (ACL). https://doi.org/10.3115/v1/w14-1609
