An overview of numerical acceleration techniques for nonlinear dimension reduction

Abstract

We are living in an increasingly data-dependent world: making sense of large, high-dimensional data sets is an important task for researchers in academia, industry, and government. Techniques from machine learning, namely nonlinear dimension reduction, seek to organize this wealth of data by extracting descriptive features. These techniques, though powerful in their ability to find compact representational forms, are hampered by high computational costs; naively implemented, they cannot process large modern data collections in reasonable time or with modest computational means. In this summary article we discuss some of the important numerical techniques that drastically increase the computational efficiency of these methods while preserving much of their representational power. Specifically, we address random projections, approximate k-nearest neighborhoods, approximate kernel methods, and approximate matrix decompositions.
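
As a brief illustration of one of the acceleration ideas named above, the sketch below (not taken from the chapter) applies a Gaussian random projection in the spirit of the Johnson-Lindenstrauss lemma to compress high-dimensional points before any downstream step such as a k-nearest-neighbor query. The data set, dimensions, and target dimension are illustrative assumptions.

```python
# A minimal sketch of a Gaussian random projection (Johnson-Lindenstrauss style).
# The synthetic data and the choices of n, d, k below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

n, d, k = 1000, 5000, 100          # n points in d dimensions, projected down to k
X = rng.standard_normal((n, d))    # synthetic high-dimensional data (assumption)

# Random projection matrix with N(0, 1/k) entries; pairwise distances are
# approximately preserved with high probability when k = O(log n / eps^2).
R = rng.standard_normal((d, k)) / np.sqrt(k)
Y = X @ R                          # projected data, shape (n, k)

# Compare one pairwise distance before and after projection.
i, j = 0, 1
orig = np.linalg.norm(X[i] - X[j])
proj = np.linalg.norm(Y[i] - Y[j])
print(f"original distance {orig:.2f}, projected distance {proj:.2f}")
```

Downstream computations, for example approximate nearest-neighbor search or kernel evaluations, can then operate on the k-dimensional points at a fraction of the original cost.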

Citation (APA)

Czaja, W., Doster, T., & Halevy, A. (2017). An overview of numerical acceleration techniques for nonlinear dimension reduction. In Applied and Numerical Harmonic Analysis (pp. 797–829). Springer International Publishing. https://doi.org/10.1007/978-3-319-55556-0_12
