Updating kernel methods in spectral decomposition by affinity perturbations


Abstract

Many machine-learning-based algorithms contain a training step that is performed once. The training step is usually computationally expensive, since it involves processing huge matrices. If the training profile is extracted from an evolving, dynamic dataset, it has to be updated whenever some features of the training dataset change. This paper proposes an efficient way to update this profile, investigating how to maintain it while the data is constantly evolving. We assume that the data is modeled by a kernel method and processed by a spectral decomposition. In many clustering and classification algorithms, a low-dimensional representation of the affinity (kernel) graph of the embedded training dataset is computed and then used to classify newly arrived data points. We present methods for updating such embeddings of the training datasets incrementally, without repeating the entire computation when changes occur in a small number of the training samples. Efficient computation of such an algorithm is critical in many web-based applications. © 2012 Elsevier Inc. All rights reserved.
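The paper's own update scheme is not reproduced in this record. As a rough illustration of the general idea of updating a spectral decomposition under a small affinity perturbation, the sketch below applies standard first-order matrix perturbation theory to the eigenpairs of a symmetric kernel matrix K after a perturbation ΔK. The function name, the numpy-based implementation, and the toy Gaussian-kernel example are assumptions for illustration, not the authors' algorithm or code.

```python
import numpy as np

def update_eigenpairs_first_order(eigvals, eigvecs, delta_K):
    # eigvals: (n,) eigenvalues of the original symmetric kernel matrix K
    # eigvecs: (n, n) orthonormal eigenvectors of K, stored as columns
    # delta_K: (n, n) symmetric perturbation, so that K_new = K + delta_K
    # Returns approximate eigenpairs of K_new via first-order perturbation theory.

    # Express the perturbation in the original eigenbasis: C[j, i] = v_j^T dK v_i.
    C = eigvecs.T @ delta_K @ eigvecs

    # Eigenvalue update: lambda_i' ~= lambda_i + v_i^T dK v_i.
    new_vals = eigvals + np.diag(C)

    # Eigenvector update:
    # v_i' ~= v_i + sum_{j != i} (v_j^T dK v_i) / (lambda_i - lambda_j) * v_j.
    gaps = eigvals[None, :] - eigvals[:, None]   # gaps[j, i] = lambda_i - lambda_j
    np.fill_diagonal(gaps, np.inf)               # drop the j == i term
    coeffs = C / gaps
    new_vecs = eigvecs + eigvecs @ coeffs

    # The first-order update is only approximate, so re-orthonormalize.
    new_vecs, _ = np.linalg.qr(new_vecs)
    return new_vals, new_vecs

# Toy example (assumed, not from the paper): Gaussian kernel on random points,
# then a small change in a few samples.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 3))
K = np.exp(-np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1))
vals, vecs = np.linalg.eigh(K)

X_new = X.copy()
X_new[:5] += 0.01 * rng.standard_normal((5, 3))
K_new = np.exp(-np.sum((X_new[:, None, :] - X_new[None, :, :]) ** 2, axis=-1))

approx_vals, approx_vecs = update_eigenpairs_first_order(vals, vecs, K_new - K)
```

Note that a first-order correction of this kind degrades when eigenvalues are nearly degenerate, and in practice only the leading eigenpairs used in the low-dimensional embedding would need to be updated.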

Citation (APA)

Shmueli, Y., Wolf, G., & Averbuch, A. (2012). Updating kernel methods in spectral decomposition by affinity perturbations. Linear Algebra and Its Applications, 437(6), 1356–1365. https://doi.org/10.1016/j.laa.2012.04.035
