Adaptive ridge is a special form of ridge regression that balances the quadratic penalty applied to each parameter of the model. This paper shows the equivalence between adaptive ridge and the lasso (least absolute shrinkage and selection operator): both procedures produce the same estimate. Least absolute shrinkage can thus be viewed as a particular quadratic penalization. From this observation, we derive an EM algorithm to compute the lasso solution. We finally present a series of applications of this type of algorithm to regression problems: kernel regression, additive modeling, and neural network training.
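The equivalence suggests an iterative scheme of the kind the abstract alludes to: each EM-style step solves a ridge problem whose per-parameter penalty weights are rebalanced from the current estimate, so small coefficients receive heavier shrinkage and are driven toward zero. A minimal sketch of this idea in NumPy follows; the function name `lasso_em`, the damping constant `eps`, and the fixed iteration count are illustrative choices, not the paper's exact formulation.

```python
import numpy as np

def lasso_em(X, y, lam, n_iter=200, eps=1e-8):
    """Sketch of an iteratively reweighted ridge (EM-style) lasso solver.

    Minimizes (1/2)||y - X beta||^2 + lam * sum(|beta_j|) by repeatedly
    solving a ridge problem with per-coefficient penalties lam / |beta_j|,
    i.e. a quadratic majorizer of the absolute-value penalty.
    """
    beta = np.linalg.lstsq(X, y, rcond=None)[0]  # unpenalized start
    for _ in range(n_iter):
        # Rebalanced quadratic penalty: heavier weight on small coefficients.
        w = lam / (np.abs(beta) + eps)
        # Solve the weighted ridge normal equations.
        beta = np.linalg.solve(X.T @ X + np.diag(w), X.T @ y)
    return beta
```

In an orthonormal design the fixed point of this iteration is the soft-thresholding rule, which is the known closed-form lasso solution in that case; the `eps` term only keeps the reweighting finite as coefficients approach zero.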
Grandvalet, Y. (1998). Least Absolute Shrinkage is Equivalent to Quadratic Penalization (pp. 201–206). https://doi.org/10.1007/978-1-4471-1599-1_27