Self-weighted multiple kernel learning for graph-based clustering and semi-supervised classification


Abstract

Multiple kernel learning (MKL) is generally believed to outperform single kernel methods. However, some empirical studies show that this is not always true: a combination of multiple kernels may yield even worse performance than a single kernel. There are two possible reasons for this failure: (i) most existing MKL methods assume that the optimal kernel is a linear combination of base kernels, which may not hold in practice; and (ii) some kernel weights are inappropriately assigned due to noise or carelessly designed algorithms. In this paper, we propose a novel MKL framework based on two intuitive assumptions: (i) each kernel is a perturbation of a consensus kernel; and (ii) a kernel that is close to the consensus kernel should be assigned a large weight. Impressively, the proposed method automatically assigns an appropriate weight to each kernel without introducing additional parameters, as existing methods do. The proposed method is integrated into a unified framework for graph-based clustering and semi-supervised classification. We have conducted experiments on multiple benchmark datasets, and the empirical results verify the superiority of the proposed framework.
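The self-weighting idea described above can be illustrated with a minimal sketch. This is not the authors' exact optimization (which is formulated jointly with graph learning in the paper); it only shows the intuition under the stated assumptions: alternately estimate a consensus kernel as the weighted average of the base kernels, then reweight each base kernel inversely to its Frobenius distance from that consensus, so that no extra trade-off parameter is needed.

```python
import numpy as np

def self_weighted_consensus(kernels, n_iter=20, eps=1e-12):
    """Illustrative sketch of consensus-based self-weighting.

    kernels : list of (n, n) base kernel matrices.
    Returns the consensus kernel and the per-kernel weights.
    """
    m = len(kernels)
    w = np.ones(m) / m  # start from uniform weights
    for _ in range(n_iter):
        # Consensus kernel: weighted average of the base kernels.
        consensus = sum(wi * K for wi, K in zip(w, kernels))
        # Distance of each base kernel from the consensus.
        d = np.array([np.linalg.norm(K - consensus) for K in kernels])
        # A kernel close to the consensus gets a large weight;
        # eps guards against division by zero.
        w = 1.0 / (2.0 * d + eps)
        w /= w.sum()
    return consensus, w
```

In a quick synthetic check, two similar base kernels (one being a slightly noisy copy of the other) end up with larger weights than an unrelated random kernel, matching assumption (ii) above.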

Citation (APA)

Kang, Z., Lu, X., Yi, J., & Xu, Z. (2018). Self-weighted multiple kernel learning for graph-based clustering and semi-supervised classification. In IJCAI International Joint Conference on Artificial Intelligence (Vol. 2018-July, pp. 2312–2318). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2018/320
