HittER: Hierarchical Transformers for Knowledge Graph Embeddings

76 Citations · 158 Mendeley Readers

Abstract

This paper examines the challenging problem of learning representations of entities and relations in a complex multi-relational knowledge graph. We propose HittER, a Hierarchical Transformer model to jointly learn Entity-relation composition and Relational contextualization based on a source entity's neighborhood. Our proposed model consists of two different Transformer blocks: the bottom block extracts features of each entity-relation pair in the local neighborhood of the source entity, and the top block aggregates the relational information from the outputs of the bottom block. We further design a masked entity prediction task to balance information from the relational context and the source entity itself. Experimental results show that HittER achieves new state-of-the-art results on multiple link prediction datasets. We additionally propose a simple approach to integrate HittER into BERT and demonstrate its effectiveness on two Freebase factoid question answering datasets.
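The two-block hierarchy described above lends itself to a compact sketch. Below is a minimal PyTorch illustration, assuming standard nn.TransformerEncoder components: a bottom block composes each entity-relation pair, a top block contextualizes the source entity against the pooled pair features, and a linear head scores candidate entities for the masked entity prediction task. All names, dimensions, and the [CLS]-style pooling are illustrative assumptions, not the authors' released implementation.

```python
# Minimal sketch of HittER's two-block hierarchy (illustrative, not official code).
import torch
import torch.nn as nn

class HittERSketch(nn.Module):
    def __init__(self, num_entities, num_relations, dim=256, heads=4, layers=2):
        super().__init__()
        self.ent_emb = nn.Embedding(num_entities, dim)
        self.rel_emb = nn.Embedding(num_relations, dim)
        # [CLS]-style aggregator tokens for each block (an assumption here).
        self.bottom_cls = nn.Parameter(torch.randn(dim))
        self.top_cls = nn.Parameter(torch.randn(dim))

        def block():
            layer = nn.TransformerEncoderLayer(dim, heads, 4 * dim, batch_first=True)
            return nn.TransformerEncoder(layer, num_layers=layers)

        self.bottom = block()  # entity-relation composition
        self.top = block()     # relational contextualization
        self.score = nn.Linear(dim, num_entities)  # masked-entity prediction head

    def forward(self, src_entity, rels, neigh_entities):
        # src_entity: (B,) source entity ids.
        # rels, neigh_entities: (B, N) entity-relation pairs in the neighborhood.
        B, N = rels.shape
        e = self.ent_emb(neigh_entities)                      # (B, N, D)
        r = self.rel_emb(rels)                                # (B, N, D)
        # Bottom block: encode each pair independently; keep the [CLS] slot.
        pair = torch.stack([e, r], dim=2).view(B * N, 2, -1)  # (B*N, 2, D)
        cls = self.bottom_cls.expand(B * N, 1, -1)
        pair_feat = self.bottom(torch.cat([cls, pair], dim=1))[:, 0].view(B, N, -1)
        # Top block: aggregate pair features around the source entity.
        src = self.ent_emb(src_entity).unsqueeze(1)           # (B, 1, D)
        top_cls = self.top_cls.expand(B, 1, -1)
        ctx = self.top(torch.cat([top_cls, src, pair_feat], dim=1))[:, 0]
        return self.score(ctx)  # logits over candidate target entities
```

In this reading, the masked entity prediction task corresponds to training the final head to recover a held-out entity, which forces the model to balance the source entity embedding against its relational context.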

Citation (APA)

Chen, S., Liu, X., Gao, J., Jiao, J., Zhang, R., & Ji, Y. (2021). HittER: Hierarchical Transformers for Knowledge Graph Embeddings. In EMNLP 2021 - 2021 Conference on Empirical Methods in Natural Language Processing, Proceedings (pp. 10395–10407). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.emnlp-main.812
