Learning the kernel matrix with low-rank multiplicative shaping

Abstract

Selecting the optimal kernel is an important and difficult challenge in applying kernel methods to pattern recognition. To address this challenge, multiple kernel learning (MKL) aims to learn a kernel from a combination of base kernel functions that performs optimally on the task. In this paper, we propose a novel MKL-themed approach that combines base kernels which are multiplicatively shaped with low-rank positive semidefinite matrices. The proposed approach generalizes several popular MKL methods and thus provides more flexibility in modeling data. Computationally, we show how these low-rank matrices can be learned efficiently from data using convex quadratic programming. Empirical studies on several standard benchmark datasets for MKL show that the new approach often yields statistically significant improvements in prediction accuracy over very competitive single-kernel and other MKL methods. Copyright © 2012, Association for the Advancement of Artificial Intelligence. All rights reserved.
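
To make the abstract's idea concrete, below is a minimal numpy sketch, assuming the combined kernel takes the form K = Σ_m (U_m U_mᵀ) ∘ K_m, i.e., each base kernel matrix K_m is shaped elementwise (Hadamard product) by a low-rank positive semidefinite matrix and the shaped kernels are summed. The function name, toy data, and chosen ranks are illustrative only, and the sketch does not implement the paper's quadratic-programming learning step; it simply shows the kernel composition and checks positive semidefiniteness.

```python
import numpy as np

def build_shaped_kernel(base_kernels, factors):
    """Combine base kernels K_m with low-rank PSD shaping matrices U_m U_m^T.

    base_kernels: list of (n, n) PSD kernel matrices K_m
    factors:      list of (n, r_m) factor matrices U_m
    returns:      (n, n) matrix sum_m (U_m U_m^T) * K_m  (elementwise product)
    """
    n = base_kernels[0].shape[0]
    K = np.zeros((n, n))
    for K_m, U_m in zip(base_kernels, factors):
        M_m = U_m @ U_m.T   # low-rank PSD shaping matrix
        K += M_m * K_m      # Hadamard product of PSD matrices stays PSD (Schur product theorem)
    return K

# Toy usage: two RBF base kernels on random data, rank-2 shaping matrices.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
base_kernels = [np.exp(-sq_dists / (2.0 * s ** 2)) for s in (0.5, 2.0)]
factors = [rng.normal(size=(50, 2)) for _ in base_kernels]

K = build_shaped_kernel(base_kernels, factors)
print(np.linalg.eigvalsh(K).min() >= -1e-8)  # True: the combined kernel is PSD
```

Note that when each U_m is a constant column vector √μ_m·1, the shaping matrix becomes μ_m·11ᵀ and the composition reduces to the familiar weighted sum Σ_m μ_m K_m, which is consistent with the abstract's claim that the approach generalizes several popular MKL formulations.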

Citation (APA)

Levinboim, T., & Sha, F. (2012). Learning the kernel matrix with low-rank multiplicative shaping. In Proceedings of the National Conference on Artificial Intelligence (Vol. 2, pp. 984–990). https://doi.org/10.1609/aaai.v26i1.8306
