Extensions of the informative vector machine

Abstract

The informative vector machine (IVM) is a practical method for Gaussian process regression and classification. The IVM produces a sparse approximation to a Gaussian process by combining assumed density filtering with a heuristic that selects points to minimize the posterior entropy. This paper extends the IVM in several ways. First, we propose a novel noise model that allows the IVM to be applied to a mixture of labeled and unlabeled data. Second, we apply the IVM to a block-diagonal covariance matrix for "learning to learn" from related tasks. Third, we modify the IVM to incorporate prior knowledge from known invariances. All of these extensions are tested on artificial and real data.
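The selection heuristic described above can be illustrated with a short sketch. The following Python snippet is a minimal illustration, not the authors' implementation: the RBF kernel, the noise variance, the function names (ivm_select, rbf_kernel), and the rank-one variance update are assumptions chosen for clarity. For Gaussian noise, the entropy reduction from including point i, -0.5 * log(1 - s_i / (s_i + sigma^2)), is monotone in the current posterior variance s_i, so the greedy rule reduces to picking the most uncertain remaining point.

import numpy as np

def rbf_kernel(X, Z, lengthscale=1.0, variance=1.0):
    # Squared-exponential covariance between the rows of X and Z.
    sq_dists = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * sq_dists / lengthscale ** 2)

def ivm_select(X, noise_var=0.1, n_active=10):
    # Greedily build an active set by maximizing the reduction in posterior
    # entropy; with Gaussian noise this amounts to choosing the point whose
    # current posterior variance s_i is largest.
    n = X.shape[0]
    K = rbf_kernel(X, X)
    s = np.diag(K).copy()        # current posterior variances
    M = np.zeros((0, n))         # low-rank factor: posterior cov = K - M.T @ M
    active = []
    for _ in range(n_active):
        scores = np.where(np.isin(np.arange(n), active), -np.inf, s)
        i = int(np.argmax(scores))
        active.append(i)
        # Rank-one update of the posterior after including point i
        # (standard Gaussian conditioning with observation noise).
        k_i = K[:, i] - M.T @ M[:, i]
        m_new = k_i / np.sqrt(s[i] + noise_var)
        s = s - m_new ** 2
        M = np.vstack([M, m_new[None, :]])
    return active

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    print(ivm_select(X, noise_var=0.1, n_active=5))

For non-Gaussian likelihoods (e.g. classification), the Gaussian conditioning step above would be replaced by an assumed-density-filtering moment match, as in the paper; that generalization is omitted here to keep the sketch short.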

Citation (APA)

Lawrence, N. D., Platt, J. C., & Jordan, M. I. (2005). Extensions of the informative vector machine. In Lecture Notes in Computer Science (Vol. 3635, pp. 56–87). Springer. https://doi.org/10.1007/11559887_4
