Generalized additive models


Abstract

Likelihood-based regression models, such as the normal linear regression model and the linear logistic model, assume a linear (or some other parametric) form for the covariates X1, X2, …, Xp. We introduce the class of generalized additive models, which replaces the linear form Σ βjXj by a sum of smooth functions Σ sj(Xj). The sj(·)'s are unspecified functions that are estimated using a scatterplot smoother, in an iterative procedure we call the local scoring algorithm. The technique is applicable to any likelihood-based regression model: the class of generalized linear models contains many of these. In this class the linear predictor η = Σ βjXj is replaced by the additive predictor Σ sj(Xj); hence the name generalized additive models. We illustrate the technique with binary response and survival data. In both cases, the method proves useful in uncovering nonlinear covariate effects. It has the advantage of being completely automatic, i.e., no "detective work" is needed on the part of the statistician. As a theoretical underpinning, the technique is viewed as an empirical method of maximizing the expected log likelihood, or equivalently, of minimizing the Kullback-Leibler distance to the true model. © 1986, Institute of Mathematical Statistics. All Rights Reserved.
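To make the idea concrete, the Gaussian-case inner loop of the estimation procedure can be sketched as backfitting: each smooth function sj is repeatedly re-estimated by smoothing the partial residuals against Xj. The sketch below is not the authors' implementation; it uses a simple running-mean smoother as a stand-in for a scatterplot smoother, and the function names (`moving_average_smoother`, `backfit`) are illustrative. The full local scoring algorithm wraps a loop like this inside iteratively reweighted fitting for non-Gaussian likelihoods.

```python
import numpy as np

def moving_average_smoother(x, y, window=11):
    """Smooth y against x with a simple running mean -- a crude
    stand-in for the scatterplot smoother used in the paper."""
    order = np.argsort(x)
    kernel = np.ones(window) / window
    smoothed = np.convolve(y[order], kernel, mode="same")
    out = np.empty_like(smoothed)
    out[order] = smoothed          # restore original observation order
    return out

def backfit(X, y, n_iter=20):
    """Fit an additive model y ~ alpha + sum_j s_j(X_j) by backfitting."""
    n, p = X.shape
    alpha = y.mean()
    s = np.zeros((n, p))
    for _ in range(n_iter):
        for j in range(p):
            # partial residual: remove all fitted terms except s_j
            partial = y - alpha - s.sum(axis=1) + s[:, j]
            s[:, j] = moving_average_smoother(X[:, j], partial)
            s[:, j] -= s[:, j].mean()   # center each s_j for identifiability
    return alpha, s
```

Each sj is centered after smoothing so that the intercept absorbs the overall mean, the usual identifiability constraint for additive fits.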

Citation (APA):
Hastie, T., & Tibshirani, R. (1986). Generalized additive models. Statistical Science, 1(3), 297–310. https://doi.org/10.1214/ss/1177013604
