Online manifold regularization: A new learning setting and empirical study


Abstract

We consider a novel “online semi-supervised learning” setting where (mostly unlabeled) data arrives sequentially in large volume, and it is impractical to store it all before learning. We propose an online manifold regularization algorithm. It differs from standard online learning in that it learns even when the input point is unlabeled. Our algorithm is based on convex programming in kernel space with stochastic gradient descent, and inherits the theoretical guarantees of standard online algorithms. However, naïve implementation of our algorithm does not scale well. This paper focuses on efficient, practical approximations; we discuss two sparse approximations using buffering and online random projection trees. Experiments show our algorithm achieves risk and generalization accuracy comparable to standard batch manifold regularization, while each step runs quickly. Our online semi-supervised learning setting is an interesting direction for further theoretical development, paving the way for semi-supervised learning to work on real-world life-long learning tasks.
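The buffering approximation mentioned in the abstract can be sketched roughly as follows: the classifier is kept as a kernel expansion over a fixed-size buffer of recent points, and each arriving point (labeled or not) triggers one stochastic gradient step on the instantaneous regularized risk. The RBF kernel doubling as the graph edge weight, the hinge loss, the step-size schedule, and the drop-oldest eviction policy are all illustrative assumptions here, not the paper's exact algorithm.

```python
import numpy as np

def rbf(x, z, gamma=1.0):
    """RBF kernel, also used as the graph edge weight (an assumed choice)."""
    return np.exp(-gamma * np.sum((np.asarray(x) - np.asarray(z)) ** 2))

class BufferedOnlineManifoldReg:
    """Sketch of buffered online manifold regularization.

    f(x) = sum_i alpha_i * K(p_i, x) over a bounded buffer of recent points.
    Each step applies one SGD update on the instantaneous risk:
    hinge loss (if labeled) + lam1 * ||f||_K^2
    + lam2 * sum_i w_i * (f(p_i) - f(x))^2 over buffered points.
    """
    def __init__(self, tau=50, lam1=0.01, lam2=0.01, gamma=1.0):
        self.tau = tau        # buffer capacity
        self.lam1 = lam1      # RKHS norm regularization weight
        self.lam2 = lam2      # manifold (graph) regularization weight
        self.gamma = gamma
        self.pts = []         # buffered inputs
        self.alpha = []       # kernel expansion coefficients

    def f(self, x):
        return sum(a * rbf(p, x, self.gamma)
                   for a, p in zip(self.alpha, self.pts))

    def step(self, x, y=None, t=1):
        eta = 0.1 / np.sqrt(t)                       # decaying SGD step size
        # Gradient of lam1 * ||f||_K^2 shrinks every coefficient.
        self.alpha = [(1 - 2 * eta * self.lam1) * a for a in self.alpha]
        fx = self.f(x)
        new_coef = 0.0
        # Labeled point: subgradient of the hinge loss max(0, 1 - y*f(x)).
        if y is not None and y * fx < 1:
            new_coef += eta * y
        # Manifold term: pull f(x) toward f at nearby buffered points,
        # so unlabeled points also trigger learning.
        for i, p in enumerate(self.pts):
            w = rbf(p, x, self.gamma)                # edge weight
            d = self.f(p) - fx
            self.alpha[i] -= 2 * eta * self.lam2 * w * d
            new_coef += 2 * eta * self.lam2 * w * d
        self.pts.append(np.asarray(x, dtype=float))
        self.alpha.append(new_coef)
        if len(self.pts) > self.tau:                 # drop-oldest eviction:
            self.pts.pop(0)                          # cruder than the paper's
            self.alpha.pop(0)                        # buffer update

    def predict(self, x):
        return 1 if self.f(x) >= 0 else -1

if __name__ == "__main__":
    m = BufferedOnlineManifoldReg(tau=20)
    m.step([0.0, 0.0], y=-1, t=1)   # labeled
    m.step([5.0, 5.0], y=1, t=2)    # labeled
    m.step([0.2, 0.1], t=3)         # unlabeled: still updates f
    print(m.predict([0.1, -0.1]), m.predict([5.1, 4.9]))
```

A serious implementation would also address the second approximation the abstract mentions (online random projection trees for approximate neighbor lookups), which this sketch omits by comparing against every buffered point.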


Authors

  • Andrew B. Goldberg

  • Ming Li

  • Xiaojin Zhu
