Randomized approximate class-specific kernel spectral regression analysis for large-scale face verification

5 Citations · 7 Readers

This article is free to access.
Abstract

Kernel methods are known to be effective for analysing complex objects by implicitly embedding them into a feature space. The approximate class-specific kernel spectral regression (ACS-KSR) method is a powerful tool for face verification. It consists of two steps, an eigenanalysis step and a kernel regression step, but it may suffer from heavy computational overhead in practice, especially on large-sample data sets. In this paper, we propose two randomized algorithms based on the ACS-KSR method. The main contributions of our work are fourfold. First, we point out that the formula used in the eigenanalysis step of the ACS-KSR method is mathematically incomplete, and we give a correction to it; moreover, we consider how to efficiently solve the ratio-trace and trace-ratio problems involved in this method. Second, although it is well known that kernel matrices are approximately low-rank, to the best of our knowledge there are few theoretical results that provide simple and feasible strategies for determining the numerical rank of a kernel matrix without forming it explicitly. To fill this gap, we focus on the commonly used Gaussian kernel and provide a practical strategy for determining the numerical rank of the kernel matrix. Third, based on the numerically low-rank property of the kernel matrix, we propose a modified Nyström method with fixed rank for the kernel regression step and establish a probabilistic error bound on the approximation. Fourth, although the proposed Nyström method reduces the computational cost of the original method, it still requires forming and storing the reduced kernel matrix explicitly, which is unfavorable for extremely large-sample data sets. To address this problem, we propose a randomized block Kaczmarz method for the kernel regression problem with multiple right-hand sides, in which there is no need to compute and store the reduced kernel matrix explicitly, and we establish its convergence. Comprehensive numerical experiments on real-world data sets demonstrate the effectiveness of our theoretical results and the efficiency of the proposed methods.
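
To make the two randomized ingredients mentioned in the abstract more concrete, the sketch below illustrates a generic fixed-rank Nyström approximation of a Gaussian kernel matrix. It is a minimal textbook-style example, not the authors' ACS-KSR algorithm: the function name nystrom_fixed_rank, the uniform landmark sampling, and the parameters m (landmarks), r (target rank) and sigma (bandwidth) are illustrative assumptions.

```python
# Minimal sketch (illustrative only, not the paper's algorithm): fixed-rank Nyström
# approximation of a Gaussian kernel matrix K_ij = exp(-||x_i - x_j||^2 / (2 sigma^2)).
import numpy as np

def gaussian_kernel(X, Y, sigma):
    # Pairwise squared distances, then the Gaussian (RBF) kernel.
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-np.maximum(sq, 0.0) / (2.0 * sigma**2))

def nystrom_fixed_rank(X, m, r, sigma, rng):
    # K ~= U U^T with an n x r factor U built from m uniformly sampled landmark columns.
    n = X.shape[0]
    idx = rng.choice(n, size=m, replace=False)
    C = gaussian_kernel(X, X[idx], sigma)        # n x m block of K
    W = C[idx, :]                                # m x m core block
    vals, vecs = np.linalg.eigh(W)               # eigenvalues in ascending order
    top = np.argsort(vals)[::-1][:r]             # keep the r largest eigenpairs
    vals, vecs = np.maximum(vals[top], 1e-12), vecs[:, top]
    return C @ (vecs / np.sqrt(vals))            # U = C V_r Lambda_r^{-1/2}
```

In the same spirit, the next sketch shows a generic randomized block Kaczmarz iteration for a consistent system A X = B with multiple right-hand sides. The block size and iteration count are hypothetical tuning parameters, and the paper's variant for the kernel regression step may differ in its sampling and convergence details.

```python
def randomized_block_kaczmarz(A, B, block_size, n_iters, rng):
    # At each step, project the iterate onto the solution set of a random block of rows:
    # X <- X + A_blk^+ (B_blk - A_blk X), computed here via a least-squares solve.
    X = np.zeros((A.shape[1], B.shape[1]))
    for _ in range(n_iters):
        rows = rng.choice(A.shape[0], size=block_size, replace=False)
        A_blk, B_blk = A[rows], B[rows]
        X += np.linalg.lstsq(A_blk, B_blk - A_blk @ X, rcond=None)[0]
    return X
```

Under these assumptions, one could pair the two pieces by regressing spectral-regression targets against a Nyström factor; the abstract's point, however, is that the Kaczmarz variant proposed in the paper avoids forming and storing the reduced kernel matrix at all.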

Cite (APA)
Li, K., & Wu, G. (2022). Randomized approximate class-specific kernel spectral regression analysis for large-scale face verification. Machine Learning, 111(6), 2037–2091. https://doi.org/10.1007/s10994-022-06140-9
