Learning embeddings of entities and relations is an efficient and versatile method to perform machine learning on relational data such as knowledge graphs. In this work, we propose holographic embeddings (HolE) to learn compositional vector space representations of entire knowledge graphs. The proposed method is related to holographic models of associative memory in that it employs circular correlation to create compositional representations. By using correlation as the compositional operator, HolE can capture rich interactions but simultaneously remains efficient to compute, easy to train, and scalable to very large datasets. Experimentally, we show that holographic embeddings are able to outperform state-of-the-art methods for link prediction on knowledge graphs and relational learning benchmark datasets.
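To make the compositional operator concrete, the following is a minimal NumPy sketch of circular correlation and a HolE-style triple score; the function names, the sigmoid link, and the toy embeddings are illustrative assumptions, not the authors' reference implementation.

import numpy as np

def circular_correlation(a, b):
    # Circular correlation [a * b]_k = sum_i a_i * b_{(i + k) mod d},
    # computed in O(d log d) via the FFT identity
    # a * b = IFFT(conj(FFT(a)) . FFT(b)).
    return np.fft.ifft(np.conj(np.fft.fft(a)) * np.fft.fft(b)).real

def hole_score(rel, subj, obj):
    # Score a (subject, relation, object) triple: dot the relation
    # embedding with the circular correlation of the entity embeddings,
    # then squash with a sigmoid to obtain a probability-like value.
    return 1.0 / (1.0 + np.exp(-np.dot(rel, circular_correlation(subj, obj))))

# Toy usage with random embeddings of dimension d = 8 (illustrative only).
d = 8
rng = np.random.default_rng(0)
e_s, e_o, r = rng.normal(size=d), rng.normal(size=d), rng.normal(size=d)
print(hole_score(r, e_s, e_o))

The FFT route is what keeps the operator cheap: unlike a full tensor product, the composed representation stays the same dimension as the entity embeddings.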
Citation
Nickel, M., Rosasco, L., & Poggio, T. (2016). Holographic embeddings of knowledge graphs. In 30th AAAI Conference on Artificial Intelligence, AAAI 2016 (pp. 1955–1961). AAAI Press. https://doi.org/10.1609/aaai.v30i1.10314