We live in an increasingly data-dependent world: making sense of large, high-dimensional data sets is an important task for researchers in academia, industry, and government. Techniques from machine learning, in particular nonlinear dimension reduction, seek to organize this wealth of data by extracting descriptive features. These techniques, though powerful in their ability to find compact representations, are hampered by high computational costs; naively implemented, they cannot process large modern data collections in reasonable time or with modest computational means. In this summary article we discuss several important numerical techniques that drastically increase the computational efficiency of these methods while preserving much of their representational power. Specifically, we address random projections, approximate k-nearest neighbor methods, approximate kernel methods, and approximate matrix decomposition methods.
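To make the first of these techniques concrete, the following is a minimal illustrative sketch (not the authors' implementation) of a Gaussian random projection in the spirit of the Johnson-Lindenstrauss lemma: a random linear map sends high-dimensional points to a much lower dimension while approximately preserving pairwise distances. All dimensions and data here are synthetic choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: 500 points in 10,000 dimensions, projected to 256.
n_points, high_dim, low_dim = 500, 10_000, 256

# Synthetic high-dimensional data.
X = rng.standard_normal((n_points, high_dim))

# Random projection matrix; scaling by 1/sqrt(low_dim) keeps the
# expected squared norm of a projected vector equal to the original.
P = rng.standard_normal((high_dim, low_dim)) / np.sqrt(low_dim)

# Projected data, shape (n_points, low_dim).
Y = X @ P

# Distances between a pair of points before and after projection
# should agree up to a small distortion.
d_high = np.linalg.norm(X[0] - X[1])
d_low = np.linalg.norm(Y[0] - Y[1])
print(d_high, d_low, d_low / d_high)
```

The appeal of this approach is that the projection is data-independent and costs only a matrix multiply, in contrast to learned projections such as PCA, which require a decomposition of the data matrix.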
Czaja, W., Doster, T., & Halevy, A. (2017). An overview of numerical acceleration techniques for nonlinear dimension reduction. In Applied and Numerical Harmonic Analysis (pp. 797–829). Springer International Publishing. https://doi.org/10.1007/978-3-319-55556-0_12