Anderson–Darling Tests of Goodness-of-Fit

  • Anderson, T. W.

Abstract

In statistics, the technique of least squares is used for estimating the unknown parameters in a linear regression model (see Linear Regression Models). This method minimizes the sum of squared distances between the observed responses in a set of data and the responses predicted by the regression model. Suppose we observe a collection of data {y_i, x_i}, i = 1, ..., n, on n units, where the y_i are responses and x_i = (x_{i1}, x_{i2}, ..., x_{ip})^T is a vector of predictors. It is convenient to write the model in matrix notation as y = Xβ + ε, (1) where y is the n × 1 vector of responses, X is the n × p matrix known as the design matrix, β = (β_1, β_2, ..., β_p)^T is the unknown parameter vector, and ε is the vector of random errors. In ordinary least squares (OLS) regression, we estimate β by minimizing the residual sum of squares, RSS = (y − Xβ)^T (y − Xβ).
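The OLS estimate described above can be sketched numerically. The following is a minimal illustration, not from the original article: it builds a synthetic design matrix and responses (all data below are invented for the example) and minimizes the residual sum of squares via NumPy's least-squares solver, which computes the standard OLS solution β̂ = (XᵀX)⁻¹Xᵀy.

```python
import numpy as np

# Synthetic data for illustration only: n observations, p predictors.
rng = np.random.default_rng(0)
n, p = 100, 3
X = rng.normal(size=(n, p))               # n x p design matrix
beta_true = np.array([2.0, -1.0, 0.5])    # assumed "true" parameters
y = X @ beta_true + 0.01 * rng.normal(size=n)  # responses with small noise

# OLS: minimize RSS = (y - X beta)^T (y - X beta).
# np.linalg.lstsq solves this least-squares problem in a numerically
# stable way, equivalent to the closed form (X^T X)^{-1} X^T y.
beta_hat, rss, rank, _ = np.linalg.lstsq(X, y, rcond=None)
print(beta_hat)
```

With noise this small, `beta_hat` recovers `beta_true` to within a few hundredths; in practice one would also report standard errors and diagnostics rather than the point estimate alone.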

APA

Anderson, T. W. (2011). Anderson–Darling Tests of Goodness-of-Fit. In International Encyclopedia of Statistical Science (pp. 52–54). Springer Berlin Heidelberg. https://doi.org/10.1007/978-3-642-04898-2_118
