Data Compression and Regression Based on Local Principal Curves

  • Einbeck J
  • Evers L
  • Hinchliff K

Abstract

Frequently the predictor space of a multivariate regression problem of the type y = m(x^(1), ..., x^(p)) + epsilon is intrinsically one-dimensional, or at least of far lower dimension than p. Usual modeling attempts such as the additive model y = m_1(x^(1)) + ... + m_p(x^(p)) + epsilon, which try to reduce the complexity of the regression problem by making additional structural assumptions, are then inefficient as they ignore the inherent structure of the predictor space and involve complicated model and variable selection stages. In a fundamentally different approach, one may consider first approximating the predictor space by a (usually nonlinear) curve passing through it, and then regressing the response only against the one-dimensional projections onto this curve. This entails the reduction from a p- to a one-dimensional regression problem. As a tool for the compression of the predictor space we apply local principal curves. Taking things on from the results presented in Einbeck et al. (Classification - The Ubiquitous Challenge. Springer, Heidelberg, 2005, pp. 256-263), we show how local principal curves can be parametrized and how the projections are obtained. The regression step can then be carried out using any nonparametric smoother. We illustrate the technique using data from the physical sciences.
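The two-step approach described above can be sketched in code. The following is a minimal illustration, not the authors' implementation: it uses a crude local principal curve (alternating local means and local first principal components, in the spirit of Einbeck et al. 2005), parametrizes the curve by cumulative arc length, projects each observation onto its nearest curve point, and then applies a Nadaraya-Watson smoother as the one-dimensional regression step. All bandwidths, step sizes, and stopping rules are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: predictors lie (noisily) along a 1-d curve in 2-d space,
# and the response depends only on the position along that curve.
t = rng.uniform(0.0, 3.0, 300)
X = np.column_stack([t, np.sin(t)]) + rng.normal(scale=0.05, size=(300, 2))
y = np.cos(t) + rng.normal(scale=0.1, size=300)

def local_principal_curve(X, h=0.3, t0=0.1, max_steps=200):
    """Crude local principal curve: alternate kernel-weighted local means
    and local first principal components, stepping in both directions
    from the starting point (here simply the mean of X)."""
    def one_direction(x, d):
        pts = []
        for _ in range(max_steps):
            w = np.exp(-np.sum((X - x) ** 2, axis=1) / (2 * h ** 2))
            if w.sum() < 1e-8:
                break
            mu = (w[:, None] * X).sum(0) / w.sum()
            C = ((X - mu).T * w) @ (X - mu) / w.sum()   # local covariance
            _, vecs = np.linalg.eigh(C)
            gamma = vecs[:, -1]           # local first principal component
            if gamma @ d < 0:             # keep a consistent heading
                gamma = -gamma
            x, d = mu + t0 * gamma, gamma
            pts.append(x)
            # stop once the step leaves the data cloud
            if np.min(np.sum((X - x) ** 2, axis=1)) > (3 * h) ** 2:
                break
        return pts

    x0 = X.mean(0)
    _, vecs = np.linalg.eigh(np.cov(X.T))   # global PC as initial heading
    d0 = vecs[:, -1]
    fwd, bwd = one_direction(x0, d0), one_direction(x0, -d0)
    return np.array(bwd[::-1] + [x0] + fwd)

curve = local_principal_curve(X)

# Parametrize the curve by cumulative arc length and project each
# observation onto its nearest curve point: this is the 1-d predictor.
seg = np.linalg.norm(np.diff(curve, axis=0), axis=1)
arclen = np.concatenate([[0.0], np.cumsum(seg)])
nearest = np.argmin(((X[:, None, :] - curve[None, :, :]) ** 2).sum(-1), axis=1)
s = arclen[nearest]

def nw_smooth(s_train, y_train, s_eval, h=0.2):
    """Nadaraya-Watson kernel smoother: the 1-d nonparametric regression."""
    w = np.exp(-((s_eval[:, None] - s_train[None, :]) ** 2) / (2 * h ** 2))
    return (w * y_train).sum(1) / w.sum(1)

y_hat = nw_smooth(s, y, s)
```

Any other nonparametric smoother (local linear, splines) could replace the final step; the point is only that after the projection, the regression is one-dimensional regardless of p.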

Citation (APA)

Einbeck, J., Evers, L., & Hinchliff, K. (2009). Data Compression and Regression Based on Local Principal Curves (pp. 701–712). https://doi.org/10.1007/978-3-642-01044-6_64
