The deep multiple kernel learning (DMKL) method has attracted widespread attention because it achieves better results than shallow multiple kernel learning. However, existing DMKL methods use a fixed number of layers and fixed kernel types, so they adapt poorly to different data sets, and it is difficult to find model parameters that improve test accuracy. In this paper, we propose a self-adaptive deep multiple kernel learning (SA-DMKL) method. SA-DMKL adapts the model by optimizing the parameters of each kernel function with a grid search and by changing the number and types of kernel functions in each layer according to a generalization bound evaluated with Rademacher chaos complexity. Experiments on three datasets from the University of California, Irvine (UCI) repository and on the Caltech 256 image dataset validate the effectiveness of the proposed method in three respects.
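The kernel selection idea in the abstract can be illustrated with a minimal sketch: a grid search over candidate kernel types and kernel parameters, scored on a held-out validation split. This is a hypothetical toy example, not the authors' SA-DMKL implementation; the data, the Nadaraya-Watson regressor, and all candidate parameter values are invented for illustration.

```python
import math

# Toy 1-D regression data (hypothetical stand-in for a real dataset): y = sin(x).
train = [(x / 2.0, math.sin(x / 2.0)) for x in range(10)]
valid = [(x / 2.0 + 0.25, math.sin(x / 2.0 + 0.25)) for x in range(9)]

# Candidate kernel functions, each with a single tunable parameter.
def rbf(u, v, gamma):
    return math.exp(-gamma * (u - v) ** 2)

def poly(u, v, degree):
    return (1.0 + u * v) ** degree

def predict(x, data, kernel, param):
    """Nadaraya-Watson kernel regression: kernel-weighted average of targets."""
    weights = [kernel(x, xi, param) for xi, _ in data]
    total = sum(weights)
    if total == 0.0:
        return 0.0
    return sum(w * yi for w, (_, yi) in zip(weights, data)) / total

def validation_error(kernel, param):
    """Mean squared error of the fitted regressor on the validation split."""
    return sum((predict(x, train, kernel, param) - y) ** 2
               for x, y in valid) / len(valid)

# Grid search over both kernel type and kernel parameter; keep the candidate
# with the lowest validation error (a simple stand-in for a model-selection
# criterion such as a generalization bound).
grid = [(rbf, g) for g in (0.1, 1.0, 10.0)] + [(poly, d) for d in (2, 3)]
best_kernel, best_param = min(grid, key=lambda kp: validation_error(*kp))
print(best_kernel.__name__, best_param)
```

In the paper's setting the selection criterion is a generalization bound based on Rademacher chaos complexity rather than raw validation error, but the search structure, enumerating kernel candidates and keeping those that score best, is the same.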
Ren, S., Shen, W., Siddique, C. N., & Li, Y. (2019). Self-adaptive deep multiple kernel learning based on Rademacher complexity. Symmetry, 11(3). https://doi.org/10.3390/sym11030325