Tensor decomposition aims to factorize an input tensor into a number of latent factors. Due to the low-rank nature of tensors in real applications, the latent factors can be used to perform tensor completion in numerous tasks, such as knowledge graph completion and timely recommendation. However, existing works solve the problem in Euclidean space, where the tensor is decomposed into Euclidean vectors. Recent studies show that hyperbolic space is roomier than Euclidean space: with the same dimension, a hyperbolic vector can represent richer information (e.g., hierarchical structure) than a Euclidean vector. In this paper, we propose to decompose tensors in hyperbolic space. Considering that the most popular optimization tools (e.g., SGD, Adam) have not been generalized to hyperbolic space, we design an adaptive optimization algorithm tailored to the distinctive properties of the hyperbolic manifold. To address the non-convexity of the problem, we adopt gradient ascent in our optimization algorithm to avoid getting trapped in local optima. We conduct experiments on various tensor completion tasks, and the results validate the superiority of our method over baselines that solve the problem in Euclidean space.
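The abstract does not spell out the update rule, so the sketch below is only a rough illustration of what optimizing latent factors in hyperbolic space involves: a generic Riemannian SGD step on the Poincaré ball (one common model of hyperbolic space), applied to a toy objective. The function names, the toy loss, and the hyperparameters are assumptions for illustration, not the paper's algorithm.

```python
# Minimal sketch (not the authors' implementation) of a Riemannian SGD step
# on the Poincare ball, the kind of update needed when latent factors live
# in hyperbolic rather than Euclidean space.
import numpy as np

EPS = 1e-5  # keep points strictly inside the open unit ball


def project_to_ball(x, eps=EPS):
    """Retract a point back inside the open unit ball (Poincare model)."""
    norm = np.linalg.norm(x)
    max_norm = 1.0 - eps
    return x * (max_norm / norm) if norm >= max_norm else x


def riemannian_sgd_step(x, euclidean_grad, lr=0.01):
    """One descent step on the Poincare ball.

    The Riemannian gradient is the Euclidean gradient rescaled by the
    inverse metric factor ((1 - ||x||^2)^2 / 4); the step is followed by a
    projection back into the ball.
    """
    scale = ((1.0 - np.dot(x, x)) ** 2) / 4.0
    return project_to_ball(x - lr * scale * euclidean_grad)


# Toy usage: pull one factor vector toward a target point in the ball,
# standing in for a gradient coming from a tensor-completion loss.
x = project_to_ball(np.array([0.3, 0.4]))
target = project_to_ball(np.array([-0.2, 0.5]))
for _ in range(100):
    grad = 2.0 * (x - target)  # gradient of a toy squared-error loss
    x = riemannian_sgd_step(x, grad)
print(x)  # moves toward `target` while staying inside the unit ball
```

The key design point this illustrates is that plain SGD or Adam updates can push a point outside the manifold; rescaling by the metric and projecting back keeps every factor a valid hyperbolic vector.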
Hui, B., & Ku, W. S. (2022). Low-rank Nonnegative Tensor Decomposition in Hyperbolic Space. In Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (pp. 646–654). Association for Computing Machinery. https://doi.org/10.1145/3534678.3539317