Nonlinear maximum margin multi-view learning with adaptive kernel

Abstract

Existing kernel-based multi-view learning methods either require the user to select and tune a single predefined kernel or must compute and store many Gram matrices to perform multiple kernel learning. Besides the heavy cost in human effort, computation, and memory, most of these models seek point estimates of their parameters and are prone to overfitting on small training sets. This paper presents an adaptive-kernel nonlinear max-margin multi-view learning model under the Bayesian framework. Specifically, we regularize the posterior of an efficient multi-view latent variable model by explicitly mapping the latent representations extracted from multiple data views into a random Fourier feature space, where max-margin classification constraints are imposed. Assuming these random features are drawn from Dirichlet process Gaussian mixtures, we can adaptively learn shift-invariant kernels from data according to Bochner's theorem. For inference, we employ the data augmentation idea for the hinge loss and design an efficient gradient-based MCMC sampler in the augmented space. Since it never computes a Gram matrix, our algorithm scales linearly with the size of the training set. Extensive experiments on real-world datasets demonstrate that our method achieves superior performance.
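The random Fourier feature idea underlying the abstract can be sketched as follows. By Bochner's theorem, any shift-invariant kernel is the Fourier transform of a spectral density, so drawing frequencies from that density and applying cosine features approximates the kernel without ever forming a Gram matrix. The snippet below is a minimal illustration, not the paper's model: it approximates the RBF kernel (whose spectral density is a single Gaussian), and then shows how drawing frequencies from a Gaussian mixture instead — the paper places a Dirichlet process prior over such mixtures — yields an adaptive kernel. All parameter values (`gamma`, the mixture weights, means, and scales) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_fourier_features(X, W, b):
    """Map X (n, d) to random Fourier features with frequencies W (D, d)
    and phases b (D,). By Bochner's theorem, z(x) @ z(y) approximates the
    shift-invariant kernel whose spectral density generated W."""
    D = W.shape[0]
    return np.sqrt(2.0 / D) * np.cos(X @ W.T + b)

d, D = 3, 2000
gamma = 0.5  # hypothetical RBF bandwidth parameter

# RBF kernel exp(-gamma * ||x - y||^2): frequencies ~ N(0, 2 * gamma * I).
W = rng.normal(scale=np.sqrt(2 * gamma), size=(D, d))
b = rng.uniform(0, 2 * np.pi, size=D)

x = rng.normal(size=(1, d))
y = rng.normal(size=(1, d))
zx = random_fourier_features(x, W, b)
zy = random_fourier_features(y, W, b)

approx = float(zx @ zy.T)
exact = float(np.exp(-gamma * np.sum((x - y) ** 2)))
print(abs(approx - exact))  # small, and shrinks as D grows

# Adaptive-kernel sketch: draw frequencies from a Gaussian mixture instead
# of a single Gaussian. A richer spectral density corresponds to a richer
# shift-invariant kernel; learning the mixture learns the kernel.
weights = np.array([0.7, 0.3])               # hypothetical mixture weights
means = np.array([[0.0] * d, [1.0] * d])     # hypothetical component means
scales = np.array([0.5, 1.5])                # hypothetical component stddevs
comp = rng.choice(2, size=D, p=weights)
W_mix = means[comp] + rng.normal(size=(D, d)) * scales[comp, None]
z_mix = random_fourier_features(x, W_mix, b)
```

A linear classifier (here, a max-margin one) trained on `z_mix` then acts as a nonlinear kernel machine, at a cost linear in the number of training points.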

Citation (APA)

He, J., Du, C., Du, C., Zhuang, F., He, Q., & Long, G. (2017). Nonlinear maximum margin multi-view learning with adaptive kernel. In IJCAI International Joint Conference on Artificial Intelligence (Vol. 0, pp. 1830–1836). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2017/254
