The objective of this paper is to introduce two linear dimensionality reduction techniques, popularly known as Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA). PCA reduces the dimensionality of the data while conserving maximum variance in new variables called principal components, whereas LDA minimizes within-class distance and maximizes the separation between classes. In other words, PCA finds the axes of maximum variance, while LDA finds the axes of best class separability. Both methods are evaluated on the MNIST handwritten digit dataset. We conclude that PCA can outperform LDA when the training dataset is small, achieving better recall values with lower computational complexity. The linear techniques presented in this paper offer a clear understanding of both methods in a comparative manner.
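The contrast above can be sketched in code. This is a minimal illustration using scikit-learn's small 8x8 digits dataset as a stand-in for MNIST; the dataset, the k-NN classifier, and the choice of nine components are assumptions for illustration, not the paper's exact experimental setup.

```python
# Sketch: compare PCA (unsupervised, max-variance axes) with LDA
# (supervised, class-separability axes) as preprocessing for a
# classifier. Assumptions: scikit-learn's 8x8 digits dataset stands
# in for MNIST; k-NN is an arbitrary downstream classifier.
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# PCA keeps the directions of maximum variance (labels ignored).
pca = PCA(n_components=9).fit(X_train)
# LDA keeps the directions of maximum class separability; it allows
# at most (n_classes - 1) = 9 components for the 10 digit classes.
lda = LinearDiscriminantAnalysis(n_components=9).fit(X_train, y_train)

results = {}
for name, reducer in [("PCA", pca), ("LDA", lda)]:
    clf = KNeighborsClassifier().fit(reducer.transform(X_train), y_train)
    results[name] = clf.score(reducer.transform(X_test), y_test)
    print(f"{name}: test accuracy = {results[name]:.3f}")
```

With ample training data both projections support accurate classification; the paper's point is that the balance can shift toward PCA when the training set is small.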
Sheikh, R., & Patel, M. (2019). Handwritten digit recognition using different dimensionality reduction techniques. International Journal of Recent Technology and Engineering, 8(2), 999–1002. https://doi.org/10.35940/ijrte.B1798.078219