In many supervised tensor learning problems, real-world data such as face images or MRI scans are naturally represented as matrices, also known as second-order tensors. Most existing classifiers based on tensor representations, such as the support tensor machine and the kernelized support tensor machine, must be solved iteratively, which is time-consuming and may suffer from local minima. In this paper, we present a kernel support matrix machine that combines a matrix-form inner product with a maximum-margin classifier. Specifically, the matrix inner product is introduced to leverage the inherent structural information within matrix data. Further, matrix kernel functions are applied to capture nonlinear relationships. We formulate a unifying optimization problem and propose an asymptotically convergent algorithm to solve it. Generalization bounds are derived based on the Rademacher complexity with respect to a probability distribution. We demonstrate the merits of the proposed method through extensive experiments on both a simulation study and a number of real-world datasets from a variety of application domains.
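The abstract's central ingredients, a matrix-form inner product and a matrix kernel built on top of it, can be illustrated with a short sketch. The abstract does not specify the paper's exact kernel form, so the Gaussian (RBF) kernel over the Frobenius distance below is an illustrative assumption, not the paper's definition; the function names `frobenius_inner` and `rbf_matrix_kernel` are likewise hypothetical.

```python
import numpy as np

def frobenius_inner(A, B):
    """Matrix-form inner product <A, B> = trace(A^T B).

    Unlike vectorizing A and B first, this keeps the data in matrix
    form, which is the structural idea behind support matrix machines.
    """
    return float(np.trace(A.T @ B))

def rbf_matrix_kernel(A, B, gamma=0.1):
    """Illustrative nonlinear matrix kernel (an assumption, not the
    paper's exact choice): a Gaussian of the Frobenius distance
    ||A - B||_F^2 = <A - B, A - B>."""
    diff = A - B
    return float(np.exp(-gamma * frobenius_inner(diff, diff)))

# Two small "images" represented as 2x3 matrices.
A = np.arange(6, dtype=float).reshape(2, 3)
B = np.ones((2, 3))

print(frobenius_inner(A, A))   # squared Frobenius norm of A
print(rbf_matrix_kernel(A, A)) # kernel of a matrix with itself is 1.0
```

A maximum-margin classifier would then use such kernel evaluations between training matrices in place of vector inner products.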
Ye, Y. (2019). A nonlinear kernel support matrix machine for matrix learning. International Journal of Machine Learning and Cybernetics, 10(10), 2725–2738. https://doi.org/10.1007/s13042-018-0896-4