Methodologies based on Reproducing Kernel Hilbert Space (RKHS) embeddings have been gaining importance in machine learning. In tasks such as time series classification, classifiers are often built on RKHS metrics to find separability among classes, which requires identifying an appropriate RKHS by tuning the characteristic kernel hyperparameters. In most applications, the characteristic kernel hyperparameter is adjusted through cross-validation heuristics. These approaches require constructing a grid of candidate values and evaluating performance at each of them, which can yield inaccurate estimates because the optimal value is not necessarily contained in the grid; they can also incur a high computational cost and long computation times. We propose to use the information potential variation (IPV) obtained from a Parzen-based probability density estimator. Specifically, we search for an RKHS by optimizing the global kernel hyperparameter that describes the IPV. Our methodology is tested on time series classification using a well-known RKHS metric, the Maximum Mean Discrepancy (MMD), with a 1-NN classifier. Results show that our strategy estimates suitable RKHSs, favoring data separability and achieving competitive average classification accuracy.
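To make the ideas in the abstract concrete, here is a minimal illustrative sketch (not the authors' exact algorithm) of the quantities involved: the Parzen-based information potential of a sample under a Gaussian kernel, a simple bandwidth-selection proxy based on where the information potential varies fastest over a candidate grid, and the biased squared MMD estimator. The function names (`information_potential`, `select_sigma`, `mmd2`) and the gradient-based selection rule are assumptions made for illustration.

```python
import numpy as np

def gaussian_gram(X, Y, sigma):
    """Gaussian kernel Gram matrix k(x, y) = exp(-||x - y||^2 / (2 sigma^2))."""
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-np.maximum(d2, 0.0) / (2.0 * sigma**2))

def information_potential(X, sigma):
    """Parzen-based information potential: the mean of all pairwise kernel
    evaluations, i.e. (1/n^2) * sum_ij k(x_i, x_j)."""
    return gaussian_gram(X, X, sigma).mean()

def select_sigma(X, sigmas):
    """Illustrative proxy for the IPV criterion: pick the bandwidth at which
    the information potential changes fastest along the candidate grid."""
    ip = np.array([information_potential(X, s) for s in sigmas])
    return sigmas[np.argmax(np.abs(np.gradient(ip, sigmas)))]

def mmd2(X, Y, sigma):
    """Biased estimator of the squared Maximum Mean Discrepancy between the
    empirical distributions of samples X and Y."""
    return (gaussian_gram(X, X, sigma).mean()
            + gaussian_gram(Y, Y, sigma).mean()
            - 2.0 * gaussian_gram(X, Y, sigma).mean())
```

For example, for two samples drawn from well-separated Gaussians, `mmd2` is larger than for two samples from the same distribution, which is what the 1-NN classifier over MMD distances exploits.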
Valencia, C. K., Álvarez, A., Valencia, E. A., Álvarez, M. A., & Orozco, Á. (2019). Information potential variability for hyperparameter selection in the MMD distance. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11401 LNCS, pp. 279–286). Springer Verlag. https://doi.org/10.1007/978-3-030-13469-3_33