Learning a Gradient-free Riemannian Optimizer on Tangent Spaces

Abstract

A principal way of addressing constrained optimization problems is to model them as problems on Riemannian manifolds. Recently, Riemannian meta-optimization has emerged as a promising way to solve constrained optimization problems by learning optimizers on Riemannian manifolds in a data-driven fashion, making it possible to design task-specific constrained optimizers. A close look at Riemannian meta-optimization reveals that learning optimizers on Riemannian manifolds requires differentiating through the nonlinear Riemannian optimization, which is complex and computationally expensive. In this paper, we propose a simple yet efficient Riemannian meta-optimization method that learns to optimize on the tangent spaces of manifolds. To this end, we present a gradient-free optimizer on tangent spaces that takes the model parameters along with the training data as inputs and generates the updated parameters directly. As a result, the constrained optimization is transformed from Riemannian manifolds to tangent spaces, where complex Riemannian operations (e.g., retractions) are removed from the optimizer, and learning the optimizer does not require differentiating through the Riemannian optimization. We empirically show that our method enables efficient learning of the optimizer while enjoying a good optimization trajectory in a data-driven manner.
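The abstract's core idea, iterating on a flat tangent space and returning to the manifold only once, can be sketched on the unit sphere. This is a minimal illustrative sketch, not the paper's method: the base point `p`, the toy data, and the `learned_update` function (a hypothetical stand-in for the learned gradient-free optimizer) are all assumptions introduced here for illustration.

```python
import numpy as np

def project_tangent(p, v):
    # Project an ambient vector v onto the tangent space of the unit sphere at p.
    return v - np.dot(p, v) * p

def exp_map(p, v):
    # Exponential map on the unit sphere: move from p along tangent vector v.
    n = np.linalg.norm(v)
    if n < 1e-12:
        return p.copy()
    return np.cos(n) * p + np.sin(n) * (v / n)

def learned_update(v, data, p):
    # Hypothetical stand-in for the learned optimizer: a gradient-free map
    # from (current tangent coordinates, training data) to new coordinates.
    # Here it simply contracts halfway toward the tangent image of the data mean.
    target = project_tangent(p, data.mean(axis=0))
    return v + 0.5 * (target - v)

rng = np.random.default_rng(0)
p = np.array([0.0, 0.0, 1.0])      # base point on the sphere S^2
data = rng.normal(size=(32, 3))    # toy training data

# All iterations happen on the flat tangent space at p -- no retraction
# inside the loop; a single exp_map returns to the manifold at the end.
v = np.zeros(3)
for _ in range(20):
    v = learned_update(v, data, p)
x = exp_map(p, v)
```

Because the loop never leaves the (flat) tangent space, each step is ordinary vector arithmetic, and meta-learning such an update rule would not need to backpropagate through retractions or other manifold operations.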




Citation (APA)

Fan, X., Gao, Z., Wu, Y., Jia, Y., & Harandi, M. (2021). Learning a Gradient-free Riemannian Optimizer on Tangent Spaces. In 35th AAAI Conference on Artificial Intelligence, AAAI 2021 (Vol. 8B, pp. 7377–7384). Association for the Advancement of Artificial Intelligence. https://doi.org/10.1609/aaai.v35i8.16905

