KLMo: Knowledge Graph Enhanced Pretrained Language Model with Fine-Grained Relationships


Abstract

Interactions between entities in a knowledge graph (KG) provide rich knowledge for language representation learning. However, existing knowledge-enhanced pretrained language models (PLMs) focus only on entity information and ignore the fine-grained relationships between entities. In this work, we propose to incorporate the KG (including both entities and relations) into the language learning process to obtain a KG-enhanced pretrained language model, namely KLMo. Specifically, a novel knowledge aggregator is designed to explicitly model the interaction between entity spans in the text and all entities and relations in a contextual KG. A relation prediction objective is utilized to incorporate relation information via distant supervision, and an entity linking objective is further utilized to link entity spans in the text to entities in the KG. In this way, structured knowledge can be effectively integrated into language representations. Experimental results demonstrate that KLMo achieves substantial improvements on several knowledge-driven tasks, such as entity typing and relation classification, compared with state-of-the-art knowledge-enhanced PLMs.
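For intuition, below is a minimal PyTorch sketch of how a knowledge aggregator and the two auxiliary objectives described above might be wired up. The dimensions, fusion scheme, label spaces, and module names are illustrative assumptions, not the paper's exact design.

```python
import torch
import torch.nn as nn

class KnowledgeAggregator(nn.Module):
    """Hypothetical sketch: entity spans in the text attend over entity and
    relation embeddings of a contextual KG, then the attended knowledge is
    fused back into the span representations."""

    def __init__(self, hidden_size=768, kg_size=200, num_heads=4):
        super().__init__()
        self.kg_proj = nn.Linear(kg_size, hidden_size)   # project KG embeddings into the text space
        self.cross_attn = nn.MultiheadAttention(hidden_size, num_heads, batch_first=True)
        self.fuse = nn.Linear(2 * hidden_size, hidden_size)

    def forward(self, span_repr, kg_repr):
        # span_repr: (batch, num_spans, hidden)  pooled entity-span representations
        # kg_repr:   (batch, num_nodes, kg_size) entity and relation embeddings of the contextual KG
        kg_h = self.kg_proj(kg_repr)
        attended, _ = self.cross_attn(span_repr, kg_h, kg_h)
        return torch.tanh(self.fuse(torch.cat([span_repr, attended], dim=-1)))


class RelationPredictionHead(nn.Module):
    """Illustrative head for the relation prediction objective: classify the
    relation between a head span and a tail span, with labels obtained by
    distant supervision from the KG."""

    def __init__(self, hidden_size=768, num_relations=100):
        super().__init__()
        self.classifier = nn.Linear(2 * hidden_size, num_relations)

    def forward(self, head_span, tail_span):
        return self.classifier(torch.cat([head_span, tail_span], dim=-1))


class EntityLinkingHead(nn.Module):
    """Illustrative head for the entity linking objective: score each entity
    span in the text against candidate KG entity embeddings."""

    def __init__(self, hidden_size=768, kg_size=200):
        super().__init__()
        self.proj = nn.Linear(hidden_size, kg_size)

    def forward(self, span_repr, entity_emb):
        # span_repr: (batch, num_spans, hidden); entity_emb: (batch, num_candidates, kg_size)
        return self.proj(span_repr) @ entity_emb.transpose(-1, -2)
```

Under these assumptions, the relation head would be trained on entity pairs whose relation is known from the KG (distant supervision), while the linking head is trained to assign each span its gold KG entity; both losses would be added to the standard PLM pretraining objective.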

Cite

APA

He, L., Zheng, S., Yang, T., & Zhang, F. (2021). KLMo: Knowledge Graph Enhanced Pretrained Language Model with Fine-Grained Relationships. In Findings of the Association for Computational Linguistics, Findings of ACL: EMNLP 2021 (pp. 4536–4542). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.findings-emnlp.384
