In the last few years, there has been a surge of interest in learning representations of entities and relations in knowledge graphs (KGs). However, the recent availability of temporal knowledge graphs (TKGs), which contain time information for each fact, has created the need for reasoning over time in such TKGs. In this regard, we present a new approach to TKG embedding, TeRo, which defines the temporal evolution of an entity embedding as a rotation from the initial time to the current time in the complex vector space. Specifically, for facts involving time intervals, each relation is represented as a pair of dual complex embeddings that handle the beginning and the end of the relation, respectively. We show that our proposed model overcomes the limitations of existing KG embedding models and TKG embedding models and can learn and infer various relation patterns over time. Experimental results on four different TKGs show that TeRo significantly outperforms existing state-of-the-art models for link prediction. In addition, we analyze the effect of time granularity on link prediction over TKGs, which, to the best of our knowledge, has not been investigated in previous literature.
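To make the rotation idea concrete, here is a minimal NumPy sketch of the scoring intuition described in the abstract: entities live in a complex vector space, a timestamp acts as an element-wise rotation (multiplication by unit-modulus complex numbers), and plausibility is measured by a translational distance in the rotated space. The exact score form, the variable names, and the use of NumPy are illustrative assumptions for exposition, not the authors' reference implementation.

```python
import numpy as np

def tero_style_score(subj, obj, rel, time_phase):
    """Sketch of a TeRo-style score for a quadruple (s, r, o, t).

    subj, obj, rel : complex vectors of shape (d,), illustrative embeddings
    time_phase     : real vector of shape (d,), per-dimension rotation angles
    Returns a distance; a smaller value means the fact is scored as more plausible.
    """
    # The timestamp acts as an element-wise rotation in the complex plane.
    rotation = np.exp(1j * time_phase)   # unit-modulus complex numbers
    subj_t = subj * rotation             # temporally rotated subject embedding
    obj_t = obj * rotation               # temporally rotated object embedding

    # Translational distance in the rotated space (assumed L1 norm here);
    # the conjugate on the object side is part of this sketch's assumptions.
    return np.linalg.norm(subj_t + rel - np.conj(obj_t), ord=1)

# Toy usage with random embeddings of dimension 4.
rng = np.random.default_rng(0)
d = 4
subj = rng.normal(size=d) + 1j * rng.normal(size=d)
obj = rng.normal(size=d) + 1j * rng.normal(size=d)
rel = rng.normal(size=d) + 1j * rng.normal(size=d)
time_phase = rng.uniform(0, 2 * np.pi, size=d)

print(tero_style_score(subj, obj, rel, time_phase))
```

For facts with time intervals, the abstract indicates that a relation would carry two such complex embeddings, one for the beginning and one for the end of the interval, each scored in the same way.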
Citation
Xu, C., Nayyeri, M., Alkhoury, F., Yazdi, H. S., & Lehmann, J. (2020). TeRo: A Time-aware Knowledge Graph Embedding via Temporal Rotation. In COLING 2020 - 28th International Conference on Computational Linguistics, Proceedings of the Conference (pp. 1583–1593). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2020.coling-main.139