Fully Hyperbolic Neural Networks

Citations: 40
Mendeley readers: 192

Abstract

Hyperbolic neural networks have shown great potential for modeling complex data. However, existing hyperbolic networks are not completely hyperbolic, as they encode features in the hyperbolic space yet formalize most of their operations in the tangent space (a Euclidean subspace) at the origin of the hyperbolic model. This hybrid method greatly limits the modeling ability of networks. In this paper, we propose a fully hyperbolic framework to build hyperbolic networks based on the Lorentz model by adapting the Lorentz transformations (including boost and rotation) to formalize essential operations of neural networks. Moreover, we also prove that linear transformation in tangent spaces used by existing hyperbolic networks is a relaxation of the Lorentz rotation and does not include the boost, implicitly limiting the capabilities of existing hyperbolic networks. The experimental results on four NLP tasks show that our method has better performance for building both shallow and deep networks. Our code is released to facilitate follow-up research.
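The core idea, operating directly on the Lorentz (hyperboloid) model rather than detouring through the tangent space, can be illustrated with a minimal sketch. The helper names (`lift`, `hyperbolic_linear`) and the choice of curvature −1 are illustrative assumptions, not the paper's exact formulation: points satisfy ⟨x, x⟩_L = −1 under the Minkowski inner product, a Lorentz boost preserves that constraint exactly, and a linear layer can stay on the manifold by recomputing the time component from the transformed spatial part.

```python
import numpy as np

def minkowski_inner(u, v):
    # Lorentzian inner product: -u0*v0 + <u_space, v_space>
    return -u[0] * v[0] + np.dot(u[1:], v[1:])

def lift(x_space):
    # Lift a Euclidean vector onto the hyperboloid (curvature -1):
    # the time component is chosen so that <x, x>_L = -1.
    x0 = np.sqrt(1.0 + np.dot(x_space, x_space))
    return np.concatenate(([x0], x_space))

def lorentz_boost_2d(phi):
    # A Lorentz boost in 1+1 dimensions; since cosh^2 - sinh^2 = 1,
    # it preserves the Minkowski inner product (and the hyperboloid).
    return np.array([[np.cosh(phi), np.sinh(phi)],
                     [np.sinh(phi), np.cosh(phi)]])

def hyperbolic_linear(x, W):
    # Sketch of a "fully hyperbolic" linear layer: transform only the
    # spatial part with W, then recompute the time component so the
    # output stays on the hyperboloid -- no tangent-space detour.
    return lift(W @ x[1:])

# A point on the 1D hyperboloid, boosted:
x = lift(np.array([0.8]))
y = lorentz_boost_2d(0.7) @ x          # still satisfies <y, y>_L = -1

# A 2D point pushed through the sketch linear layer:
z = hyperbolic_linear(lift(np.array([0.3, -0.5])),
                      np.array([[1.0, 2.0], [0.5, -1.0]]))
```

Note the contrast with the hybrid approach the abstract criticizes: there, features would be mapped to the tangent space at the origin (via the logarithmic map), transformed with a Euclidean linear layer, and mapped back (via the exponential map), which restricts the transformation to something weaker than a general Lorentz boost plus rotation.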


Citation (APA)

Chen, W., Han, X., Lin, Y., Zhao, H., Liu, Z., Li, P., … Zhou, J. (2022). Fully Hyperbolic Neural Networks. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (Vol. 1, pp. 5672–5686). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2022.acl-long.389

Readers over time

[Chart: Mendeley readers per year, 2018–2025]

Readers' Seniority

PhD / Postgrad / Masters / Doc: 63 (66%)
Researcher: 26 (27%)
Professor / Associate Prof.: 3 (3%)
Lecturer / Postdoc: 3 (3%)

Readers' Discipline

Computer Science: 77 (78%)
Engineering: 13 (13%)
Mathematics: 6 (6%)
Neuroscience: 3 (3%)
