Stochastic non-convex ordinal embedding with stabilized Barzilai-Borwein step size

Abstract

Learning representations from relative similarity comparisons, often called ordinal embedding, has gained increasing attention in recent years. Most existing methods are batch methods designed around convex optimization, e.g., the projected gradient descent method. However, they are generally time-consuming because singular value decomposition (SVD) is typically required at each update, especially when the data size is very large. To overcome this challenge, we propose a stochastic algorithm called SVRG-SBB, which has the following features: (a) it is SVD-free by dropping convexity and scales well through the use of a stochastic algorithm, namely stochastic variance reduced gradient (SVRG); and (b) it chooses the step size adaptively via a new stabilized Barzilai-Borwein (SBB) method, since the original BB method, developed for convex problems, may fail on the non-convex stochastic optimization problem considered here. Moreover, we show that the proposed algorithm converges to a stationary point at a rate of O(1/T) in our setting, where T is the total number of iterations. Extensive simulations and real-world data experiments demonstrate the effectiveness of the proposed algorithm in comparison with state-of-the-art methods; in particular, it achieves much lower computational cost with good prediction performance.
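For concreteness, the sketch below illustrates the kind of update the abstract describes: an SVRG outer/inner loop whose epoch-wise step size is set by a stabilized Barzilai-Borwein rule. This is a minimal illustration, not the paper's reference implementation; the particular SBB formula used here (absolute value of the BB denominator plus an eps * ||dx||^2 stabilizer), the gradient oracle grad_i, and all parameter names are assumptions made for the sketch.

```python
import numpy as np


def svrg_sbb(grad_i, x0, n, m=None, epochs=30, eps=0.1, eta0=0.01, rng=None):
    """SVRG with a stabilized Barzilai-Borwein (SBB) epoch step size (sketch).

    grad_i(x, i): gradient of the i-th component function at x (1-D array).
    n:            number of component functions in the finite sum.
    m:            inner-loop length (defaults to n, as in standard SVRG).
    eps:          assumed SBB stabilizer; caps the step size at 1 / (m * eps).
    eta0:         fixed step size for the first epoch, before BB kicks in.
    """
    rng = np.random.default_rng() if rng is None else rng
    m = n if m is None else m
    x_tilde = np.asarray(x0, dtype=float).copy()
    full_grad = np.mean([grad_i(x_tilde, i) for i in range(n)], axis=0)
    eta = eta0
    for _ in range(epochs):
        x = x_tilde.copy()
        for _ in range(m):
            i = rng.integers(n)
            # Variance-reduced stochastic gradient (the SVRG correction).
            v = grad_i(x, i) - grad_i(x_tilde, i) + full_grad
            x = x - eta * v
        prev_x, prev_grad = x_tilde, full_grad
        x_tilde = x
        full_grad = np.mean([grad_i(x_tilde, i) for i in range(n)], axis=0)
        # Stabilized BB step (assumed form): |.| plus eps * ||dx||^2 keeps
        # the denominator positive and bounded away from zero, so the step
        # stays finite even when <dx, dg> is tiny or negative, as can
        # happen in non-convex problems.
        dx = x_tilde - prev_x
        dg = full_grad - prev_grad
        denom = m * (abs(np.dot(dx, dg)) + eps * np.dot(dx, dx))
        if denom > 0:
            eta = np.dot(dx, dx) / denom
    return x_tilde
```

As a quick sanity check, one can run this on a toy least-squares finite sum, e.g. grad_i = lambda x, i: A[i] * (A[i] @ x - b[i]) for rows A[i] of a random matrix. The eps term is what distinguishes SBB from the plain SVRG-BB step of the convex setting: it bounds the step by 1/(m*eps), preventing the blow-up that motivates the stabilization in the first place.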

Citation (APA)

Ma, K., Zeng, J., Xiong, J., Xu, Q., Cao, X., Liu, W., & Yao, Y. (2018). Stochastic non-convex ordinal embedding with stabilized Barzilai-Borwein step size. In 32nd AAAI Conference on Artificial Intelligence, AAAI 2018 (pp. 3738–3745). AAAI Press. https://doi.org/10.1609/aaai.v32i1.11599
