Eigendecomposition is the factorization of a matrix into its canonical form, whereby the matrix is represented in terms of its eigenvalues and eigenvectors. A common step in kernel-based machine learning is reducing the data to a kernel matrix, also known as a Gram matrix. A significant drawback of kernel methods is the computational complexity associated with manipulating kernel matrices. This paper demonstrates that leading eigenvectors derived from singular value decomposition (SVD) and Nyström approximation methods can be utilized for classification tasks without the need to construct Gram matrices. Experiments were conducted with 14 biomedical datasets to compare classifier performance when the classifier input was: 1) a matrix of leading eigenvectors produced by each approximation method, and 2) the patient-by-patient Gram matrix. The results support the main hypothesis of this paper that using the leading eigenvectors as classifier input significantly (p < 0.05) improves classifier performance in terms of accuracy and computation time compared with using Gram matrices. Furthermore, experiments were carried out using large multi-modal mHealth time series datasets of ten subjects with diverse profiles performing several physical activities; these experiments utilized a sequential deep learning model. The significance of the proposed approach is that it can make feature extraction methods more accessible on large-scale unimodal and multi-modal data, which are becoming common in many applications.
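To illustrate the core idea, the sketch below contrasts the two classifier inputs described in the abstract: an explicit patient-by-patient Gram matrix versus a small matrix of leading eigenvectors obtained from a Nyström approximation followed by truncated SVD. This is a minimal illustration only, not the authors' implementation: the RBF kernel, the gamma value, the number of components k, the logistic-regression classifier, and the example dataset are all assumptions made for demonstration.

```python
# Minimal sketch (not the authors' code): compare feeding a classifier the full
# Gram matrix versus a matrix of leading eigenvectors from a Nystrom + SVD step.
# Kernel choice, gamma, k, dataset, and classifier are illustrative assumptions.
from sklearn.datasets import load_breast_cancer
from sklearn.decomposition import TruncatedSVD
from sklearn.kernel_approximation import Nystroem
from sklearn.linear_model import LogisticRegression
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

k = 20  # number of leading components/eigenvectors kept (illustrative choice)

# Baseline: explicit patient-by-patient Gram matrix used directly as input.
K_train = rbf_kernel(X_train, X_train, gamma=1e-4)
K_test = rbf_kernel(X_test, X_train, gamma=1e-4)
clf_gram = LogisticRegression(max_iter=5000).fit(K_train, y_train)

# Alternative: Nystrom landmarks approximate the kernel feature map without
# forming the full Gram matrix; truncated SVD then keeps the leading directions.
nystroem = Nystroem(gamma=1e-4, n_components=100, random_state=0)
Z_train = nystroem.fit_transform(X_train)
Z_test = nystroem.transform(X_test)
svd = TruncatedSVD(n_components=k, random_state=0)
V_train = svd.fit_transform(Z_train)
V_test = svd.transform(Z_test)
clf_eig = LogisticRegression(max_iter=5000).fit(V_train, y_train)

print("Gram-matrix input accuracy:   ", clf_gram.score(K_test, y_test))
print("Leading-eigenvector accuracy: ", clf_eig.score(V_test, y_test))
```

In this sketch the Gram-matrix classifier works with an n-by-n input that grows quadratically with the number of patients, whereas the eigenvector-based input is only n-by-k, which is what makes the approach attractive for large unimodal and multi-modal datasets.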
Cosma, G., & Martin McGinnity, T. (2019). Feature Extraction and Classification Using Leading Eigenvectors: Applications to Biomedical and Multi-Modal mHealth Data. IEEE Access, 7, 107400–107412. https://doi.org/10.1109/ACCESS.2019.2932868