The Catline for Deep Regression

Abstract

Motivated by the notion of regression depth (Rousseeuw and Hubert, 1996) we introduce the catline, a new method for simple linear regression. At any bivariate data set Z_n = {(x_i, y_i); i = 1, ..., n} its regression depth is at least n/3. This lower bound is attained for data lying on a convex or concave curve, whereas for perfectly linear data the catline attains a depth of n. We construct an O(n log n) algorithm for the catline, so it can be computed fast in practice. The catline is Fisher-consistent at any linear model y = βx + α + e in which the error distribution satisfies med(e|x) = 0, which encompasses skewed and/or heteroscedastic errors. The breakdown value of the catline is 1/3, and its influence function is bounded. At the bivariate Gaussian distribution its asymptotic relative efficiency compared to the L1 line is 79.3% for the slope, and 88.9% for the intercept. The finite-sample relative efficiencies are in close agreement with these values. This combination of properties makes the catline an attractive fitting method. © 1998 Academic Press.
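To make the notion of regression depth concrete, the sketch below evaluates the depth of a candidate line y = b*x + a against a bivariate sample, in the spirit of the Rousseeuw and Hubert (1996) definition for simple regression. It is a brute-force O(n^2) illustration, not the paper's O(n log n) catline algorithm; the function name regression_depth, the choice of tilting points, and the assumption of distinct x-values are ours.

```python
import numpy as np

def regression_depth(a, b, x, y):
    """Depth of the line y = b*x + a relative to the sample (x_i, y_i).

    Brute-force O(n^2) illustration of regression depth for simple
    regression: the smallest number of observations the line must pass
    when it is tilted to vertical around some pivot, minimized over all
    candidate pivot positions. Assumes distinct x-values for simplicity.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    r = y - (b * x + a)                     # residuals of the candidate fit
    order = np.argsort(x)
    xs, rs = x[order], r[order]
    n = len(xs)

    # Candidate pivot positions: outside the data range and between
    # consecutive sorted x-values.
    cuts = np.concatenate(([xs[0] - 1.0],
                           (xs[:-1] + xs[1:]) / 2.0,
                           [xs[-1] + 1.0]))

    depth = n
    for v in cuts:
        left = xs < v                       # observations left of the pivot
        nonneg, nonpos = rs >= 0, rs <= 0
        # Observations the line must pass when tilted clockwise vs.
        # counterclockwise around the pivot at x = v.
        d_cw = np.sum(left & nonneg) + np.sum(~left & nonpos)
        d_ccw = np.sum(left & nonpos) + np.sum(~left & nonneg)
        depth = min(depth, d_cw, d_ccw)
    return int(depth)

# Example: perfectly collinear data attain the maximal depth n.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.0 * xi + 1.0 for xi in x]
print(regression_depth(a=1.0, b=2.0, x=x, y=y))   # -> 5
```

As a sanity check against the abstract: for data lying exactly on the candidate line all residuals are zero, so both counts equal n at every pivot and the depth is n, matching the statement that perfectly linear data attain depth n.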

Citation (APA)

Hubert, M., & Rousseeuw, P. J. (1998). The Catline for Deep Regression. Journal of Multivariate Analysis, 66(2), 270–296. https://doi.org/10.1006/jmva.1998.1751
