On the degrees of freedom in shape-restricted regression


Abstract

For the problem of estimating a regression function, μ say, subject to shape constraints, such as monotonicity or convexity, it is argued that the divergence of the maximum likelihood estimator provides a useful measure of the effective dimension of the model. Inequalities are derived for the expected mean squared error of the maximum likelihood estimator and the expected residual sum of squares. These generalize equalities from the case of linear regression. As an application, it is shown that the maximum likelihood estimator of the error variance σ² is asymptotically normal with mean σ² and variance 2σ⁴/n. For monotone regression, it is shown that the maximum likelihood estimator of μ attains the optimal rate of convergence, and a bias correction to the maximum likelihood estimator of σ² is derived.
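For monotone (isotonic) regression, the divergence-based measure of effective dimension discussed in the abstract reduces to the number of distinct values in the fitted vector. A minimal sketch of this, using a pure-NumPy pool-adjacent-violators (PAVA) fit — the function names `pava` and `effective_df` are illustrative, not from the paper:

```python
import numpy as np

def pava(y):
    """Least-squares nondecreasing fit via pool-adjacent-violators.

    Maintains a stack of blocks (value, weight); whenever a new block
    violates monotonicity, adjacent blocks are merged into their
    weighted mean.
    """
    vals, wts = [], []
    for v in np.asarray(y, dtype=float):
        vals.append(v)
        wts.append(1.0)
        # Merge while the last two blocks are out of order.
        while len(vals) > 1 and vals[-2] > vals[-1]:
            w = wts[-1] + wts[-2]
            merged = (vals[-1] * wts[-1] + vals[-2] * wts[-2]) / w
            vals.pop(); wts.pop()
            vals[-1], wts[-1] = merged, w
    return np.repeat(vals, np.array(wts).astype(int))

def effective_df(fitted):
    # For monotone regression the divergence of the MLE equals the
    # number of distinct fitted values, which serves as the model's
    # effective degrees of freedom.
    return len(np.unique(fitted))

fit = pava([1.0, 3.0, 2.0, 4.0])   # → [1.0, 2.5, 2.5, 4.0]
print(fit, effective_df(fit))      # three distinct fitted values
```

Here the violating pair (3, 2) is pooled into its mean 2.5, so the fit has three level sets and the effective dimension is 3, even though four observations were fitted.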

Citation
Meyer, M., & Woodroofe, M. (2000). On the degrees of freedom in shape-restricted regression. Annals of Statistics, 28(4), 1083–1104. https://doi.org/10.1214/aos/1015956708
