A theoretical analysis of semi-supervised learning

Abstract

We analyze the dynamical behavior of semi-supervised learning in the framework of on-line learning using a statistical-mechanical method. In each update, a student uses several correlated input vectors but is given the desired output for only one of them. Using the self-averaging property in the thermodynamic limit, we derive deterministic simultaneous differential equations that describe the dynamical behavior of the order parameters. We treat the Hebbian and Perceptron learning rules and show that, for both rules, using unlabeled data is effective in the early stages of learning. We also show that the two rules exhibit qualitatively different dynamical behavior. Furthermore, we propose a new algorithm that improves the generalization performance by switching the number of input vectors used in an update as learning proceeds.
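The abstract does not give the update equations, but the setup can be illustrated with a small simulation. The sketch below is a minimal, hedged illustration rather than the paper's method: it assumes a teacher-student perceptron scenario, Hebbian-style updates, inputs correlated through a shared common component, and that the student uses its own predicted sign for the unlabeled vectors (a self-training assumption). The dimension N, the number of correlated inputs K per update, the learning rate eta, and the correlation level c are illustrative choices.

import numpy as np

# Minimal sketch of semi-supervised on-line learning in a teacher-student setup.
# Assumptions (not taken from the paper): Hebbian-style updates, self-training
# on unlabeled inputs, and correlation via a shared common component.

N = 1000          # input dimension (the analysis holds as N -> infinity)
K = 3             # correlated input vectors per update; only one is labeled
eta = 1.0         # learning rate (illustrative)
steps = 20 * N    # number of on-line updates

rng = np.random.default_rng(0)
B = rng.standard_normal(N)                     # teacher weight vector
B /= np.linalg.norm(B) / np.sqrt(N)            # normalize |B| = sqrt(N)
J = np.zeros(N)                                # student weight vector

def correlated_inputs(rng, n, k, c=0.5):
    """Draw k input vectors sharing a common component (pairwise correlation ~ c)."""
    common = rng.standard_normal(n)
    return [np.sqrt(c) * common + np.sqrt(1.0 - c) * rng.standard_normal(n)
            for _ in range(k)]

for t in range(steps):
    xs = correlated_inputs(rng, N, K)
    for i, x in enumerate(xs):
        if i == 0:
            label = np.sign(B @ x)             # teacher label for the one labeled input
        else:
            label = np.sign(J @ x)             # student's own prediction (assumption)
            if label == 0:
                label = 1.0                    # break ties while J is still zero
        J += (eta / N) * label * x             # Hebbian-style update

# Generalization error from the teacher-student overlap R: eps_g = arccos(R) / pi
R = (B @ J) / (np.linalg.norm(B) * np.linalg.norm(J))
print("generalization error ~", np.arccos(R) / np.pi)

Switching the number of input vectors K between phases of this loop would correspond, roughly, to the switching algorithm the abstract proposes; the exact switching schedule is specified in the paper, not here.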

Citation (APA)
Fujii, T., Ito, H., & Miyoshi, S. (2016). A theoretical analysis of semi-supervised learning. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 9948 LNCS, pp. 28–36). Springer Verlag. https://doi.org/10.1007/978-3-319-46672-9_4
