BERT-MK: Integrating graph contextualized knowledge into pre-trained language models

Abstract

Complex node interactions are common in knowledge graphs (KGs), and these interactions can be viewed as contextualized knowledge residing in the topological structure of the KG. Traditional knowledge representation learning (KRL) methods usually treat a single triple as a training unit, neglecting this graph contextualized knowledge. To exploit this untapped graph-level knowledge, we propose an approach that models subgraphs in a medical KG. The learned knowledge is then integrated into a pre-trained language model for knowledge generalization. Experimental results demonstrate that our model achieves state-of-the-art performance on several medical NLP tasks, and the improvement over MedERNIE indicates that graph contextualized knowledge is beneficial.
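The abstract outlines two steps: sampling graph context (subgraphs) from a medical KG, and feeding the learned knowledge into a pre-trained language model. The sketch below is a minimal, hypothetical illustration of the first step only, not the paper's actual BERT-MK architecture: it gathers a one-hop subgraph around a target entity from a toy KG and linearizes its triples into a token sequence that a Transformer-style knowledge encoder could consume. The toy triples, function names, and linearization scheme are assumptions made for illustration.

```python
# Minimal, hypothetical sketch of turning a KG subgraph into encoder input.
# The toy triples, function names, and linearization scheme are illustrative
# assumptions, not the actual BERT-MK pipeline.

from collections import defaultdict

# Toy medical KG stored as (head, relation, tail) triples.
TRIPLES = [
    ("aspirin", "treats", "headache"),
    ("aspirin", "interacts_with", "warfarin"),
    ("warfarin", "treats", "thrombosis"),
    ("headache", "symptom_of", "migraine"),
]

def build_index(triples):
    """Index triples by both head and tail so neighbors can be looked up quickly."""
    index = defaultdict(list)
    for h, r, t in triples:
        index[h].append((h, r, t))
        index[t].append((h, r, t))
    return index

def sample_subgraph(entity, index, max_hops=1):
    """Collect all triples within `max_hops` of `entity` (its graph context)."""
    frontier, seen, subgraph = {entity}, {entity}, set()
    for _ in range(max_hops):
        next_frontier = set()
        for node in frontier:
            for h, r, t in index.get(node, []):
                subgraph.add((h, r, t))
                for neighbor in (h, t):
                    if neighbor not in seen:
                        seen.add(neighbor)
                        next_frontier.add(neighbor)
        frontier = next_frontier
    return sorted(subgraph)

def linearize_subgraph(subgraph):
    """Flatten triples into one token sequence a Transformer encoder could read."""
    tokens = []
    for h, r, t in subgraph:
        tokens += ["[HEAD]", h, "[REL]", r, "[TAIL]", t]
    return tokens

if __name__ == "__main__":
    index = build_index(TRIPLES)
    subgraph = sample_subgraph("aspirin", index, max_hops=1)
    print(linearize_subgraph(subgraph))
```

In this toy setting, the one-hop context of "aspirin" contains both its treatment and drug-interaction triples, so the encoder sees more than a single isolated triple, which is the graph-level signal the abstract argues traditional per-triple KRL misses.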

Citation (APA)

He, B., Zhou, D., Xiao, J., Jiang, X., Liu, Q., Yuan, N. J., & Xu, T. (2020). BERT-MK: Integrating graph contextualized knowledge into pre-trained language models. In Findings of the Association for Computational Linguistics: EMNLP 2020 (pp. 2281–2290). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2020.findings-emnlp.207
