Locally fitting hyperplanes to high-dimensional data

Abstract

Problems such as data compression, pattern recognition and artificial intelligence often treat a large data sample as observations of an unknown object. An effective method is proposed to fit hyperplanes to the data points in each hypercubic subregion of the original data sample. The locally fitted hyperplanes, which correspond to a set of affine linear manifolds, optimally approximate the object in the sense of least squares of their perpendicular distances to the sample points. Based on the two essential concepts of hyperplane fitting and spatial data segmentation, this general method for unsupervised learning is rigorously derived. Its effectiveness and versatility are illustrated through approximation of the nonlinear manifolds of the Möbius strip and the Swiss roll, handwritten digit recognition, dimensionality reduction in a cosmological application, interpolation/extrapolation for a social and economic data set, and prediction of recidivism of criminal defendants. The proposed method requires no assumptions about the underlying object or its data sample, and it has only two parameters for the user to choose: the size of the segmenting hypercubes and the number of fitted hyperplanes. These properties make the method readily accessible for solving a variety of problems in real applications.
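To make the idea concrete, the sketch below illustrates one plausible reading of the two core ingredients: a total-least-squares hyperplane fit (minimizing perpendicular distances via the smallest singular vector of the centered data) and a segmentation of the sample into hypercubic cells with one hyperplane fitted per non-empty cell. This is only a minimal, assumed illustration of the general approach described in the abstract, not the authors' algorithm; the function names, the cell-indexing scheme, and the `cube_size` parameter are hypothetical.

```python
import numpy as np

def fit_hyperplane(points):
    """Total-least-squares hyperplane fit: minimizes the sum of squared
    perpendicular distances from the points to the hyperplane.
    Returns (centroid, unit normal)."""
    centroid = points.mean(axis=0)
    # The normal is the right singular vector of the centered data
    # associated with the smallest singular value.
    _, _, vt = np.linalg.svd(points - centroid, full_matrices=False)
    return centroid, vt[-1]

def fit_local_hyperplanes(data, cube_size):
    """Segment the sample into hypercubic cells of side `cube_size` and fit
    one hyperplane per non-empty cell (a rough analogue of the paper's
    local fitting scheme; the details here are assumptions)."""
    cells = np.floor(data / cube_size).astype(int)
    planes = {}
    for key in map(tuple, np.unique(cells, axis=0)):
        pts = data[np.all(cells == key, axis=1)]
        if len(pts) >= data.shape[1]:  # enough points for a stable fit
            planes[key] = fit_hyperplane(pts)
    return planes

# Example: noisy Swiss-roll-like data in 3-D
rng = np.random.default_rng(0)
t = rng.uniform(1.5 * np.pi, 4.5 * np.pi, 2000)
data = np.column_stack([t * np.cos(t), rng.uniform(0, 10, t.size), t * np.sin(t)])
data += rng.normal(scale=0.1, size=data.shape)
planes = fit_local_hyperplanes(data, cube_size=3.0)
print(f"fitted {len(planes)} local hyperplanes")
```

Note that the abstract also mentions choosing the number of fitted hyperplanes as a second parameter; the sketch above instead fits exactly one hyperplane per occupied cell, which is a simplification.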

Citation (APA)
Hou, M., & Kambhampati, C. (2022). Locally fitting hyperplanes to high-dimensional data. Neural Computing and Applications, 34(11), 8885–8896. https://doi.org/10.1007/s00521-022-06909-y
