Absolute Penalty Estimation

  • Ahmed E
  • Raheem E
  • Hossain S
Abstract

In statistics, the technique of least squares is used for estimating the unknown parameters in a linear regression model (see Linear Regression Models). This method minimizes the sum of squared distances between the observed responses in a set of data and the responses predicted by the regression model. Suppose we observe a collection of data {(y_i, x_i)}, i = 1, ..., n, on n units, where the y_i are responses and x_i = (x_{i1}, x_{i2}, ..., x_{ip})^T is a vector of predictors. It is convenient to write the model in matrix notation as

y = Xβ + ε,   (1)

where y is the n × 1 vector of responses, X is the n × p matrix known as the design matrix, β = (β_1, β_2, ..., β_p)^T is the unknown parameter vector, and ε is the vector of random errors. In ordinary least squares (OLS) regression, we estimate β by minimizing the residual sum of squares, RSS = (y − Xβ)^T (y − Xβ).
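The OLS estimator described above can be sketched numerically. The following is a minimal illustration using simulated data (the data, dimensions, and noise level are assumptions for the example, not from the article): β is obtained by solving the normal equations X^T X β = X^T y, which minimize RSS = (y − Xβ)^T (y − Xβ).

```python
import numpy as np

# Simulated data for illustration only.
rng = np.random.default_rng(0)
n, p = 50, 3
X = rng.normal(size=(n, p))              # n x p design matrix
beta_true = np.array([1.5, -2.0, 0.5])   # hypothetical true coefficients
y = X @ beta_true + 0.1 * rng.normal(size=n)

# OLS estimate: solve the normal equations X^T X beta = X^T y.
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

# Residual sum of squares at the estimate.
rss = float((y - X @ beta_hat) @ (y - X @ beta_hat))
print(beta_hat, rss)
```

With the small noise level used here, the estimate recovers the simulated coefficients closely; penalized variants of this criterion (the subject of the article) add a constraint on the size of β.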

Citation (APA)
Ahmed, E. S., Raheem, E., & Hossain, S. (2011). Absolute Penalty Estimation. In International Encyclopedia of Statistical Science (pp. 1–3). Springer Berlin Heidelberg. https://doi.org/10.1007/978-3-642-04898-2_102
