AALRSMF: An adaptive learning rate schedule for matrix factorization


Abstract

Stochastic gradient descent (SGD) is an effective algorithm for solving the matrix factorization problem. However, the performance of SGD depends critically on how learning rates are tuned over time. In this paper, we propose a novel per-dimension learning rate schedule called AALRSMF. This schedule relies on local gradients, requires no manual tuning of a global learning rate, and proves robust to the choice of hyper-parameters. Extensive experiments demonstrate that the proposed schedule yields promising results compared to existing schedules on matrix factorization.
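The abstract does not spell out the AALRSMF update rule, but the setting it describes — SGD for matrix factorization with a per-dimension learning rate driven by local gradients — can be illustrated with an AdaGrad-style sketch. In the code below, the factor matrices `P` and `Q`, the accumulators `Gp`/`Gq`, and the base step `eta` are all illustrative assumptions, not the paper's actual schedule:

```python
import numpy as np

def mf_sgd_adaptive(R, mask, k=2, eta=0.1, eps=1e-8, epochs=500, seed=0):
    """Factorize R ≈ P @ Q.T by SGD with a per-dimension adaptive
    learning rate (AdaGrad-style accumulation of squared gradients).
    Illustrative stand-in only: the exact AALRSMF rule is not given
    in the abstract."""
    rng = np.random.default_rng(seed)
    m, n = R.shape
    P = 0.1 * rng.standard_normal((m, k))
    Q = 0.1 * rng.standard_normal((n, k))
    Gp = np.zeros_like(P)  # accumulated squared gradients for P
    Gq = np.zeros_like(Q)  # accumulated squared gradients for Q
    users, items = np.nonzero(mask)
    for _ in range(epochs):
        for u, i in zip(users, items):
            err = R[u, i] - P[u] @ Q[i]
            gp = -err * Q[i]          # gradient of 0.5*err^2 w.r.t. P[u]
            gq = -err * P[u]          # gradient of 0.5*err^2 w.r.t. Q[i]
            Gp[u] += gp * gp
            Gq[i] += gq * gq
            # per-dimension step: each coordinate gets its own rate,
            # shrinking as that coordinate's gradients accumulate
            P[u] -= eta / np.sqrt(Gp[u] + eps) * gp
            Q[i] -= eta / np.sqrt(Gq[i] + eps) * gq
    return P, Q
```

The key property shared with schedules of this family is that no global learning rate needs hand-tuning per dataset: each coordinate's effective step size adapts to the magnitude of its own past gradients.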

Citation (APA)

Wei, F., Guo, H., Cheng, S., & Jiang, F. (2016). AALRSMF: An adaptive learning rate schedule for matrix factorization. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 9932 LNCS, pp. 410–413). Springer Verlag. https://doi.org/10.1007/978-3-319-45817-5_36
