Least Absolute Shrinkage is Equivalent to Quadratic Penalization

  • Grandvalet, Y.

Abstract

Adaptive ridge is a special form of ridge regression that balances the quadratic penalty applied to each parameter of the model. This paper shows the equivalence between adaptive ridge and the lasso (least absolute shrinkage and selection operator): both procedures produce the same estimate. Least absolute shrinkage can thus be viewed as a particular form of quadratic penalization. From this observation, we derive an EM algorithm to compute the lasso solution. We finally present a series of applications of this type of algorithm to regression problems: kernel regression, additive modeling, and neural network training.
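
The computational idea sketched in the abstract is that the lasso estimate can be reached by repeatedly solving ridge problems whose per-coefficient quadratic penalties are adapted to the current coefficients. The following minimal Python sketch illustrates one such iteratively reweighted ridge scheme; the function name, starting point, update rule, and stopping criterion are illustrative assumptions, not the EM algorithm as derived in the paper.

```python
import numpy as np

def lasso_via_adaptive_ridge(X, y, lam, n_iter=200, eps=1e-8, tol=1e-10):
    """Illustrative sketch: lasso estimate via iteratively reweighted ridge.

    Targets (1/2)*||y - X @ beta||^2 + lam * sum(|beta_j|) by solving, at each
    step, a ridge problem whose per-coefficient penalty weight is adapted from
    the previous iterate (assumed scheme, not the paper's notation).
    """
    beta = np.linalg.lstsq(X, y, rcond=None)[0]   # least-squares start
    for _ in range(n_iter):
        # Adaptive quadratic weights: coefficients near zero receive a large
        # penalty weight and are driven further toward zero.
        d = lam / (np.abs(beta) + eps)
        beta_new = np.linalg.solve(X.T @ X + np.diag(d), X.T @ y)
        if np.max(np.abs(beta_new - beta)) < tol:
            beta = beta_new
            break
        beta = beta_new
    return beta

if __name__ == "__main__":
    # Small synthetic check: sparse ground truth, noisy observations.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 10))
    true_beta = np.zeros(10)
    true_beta[:3] = [2.0, -1.5, 1.0]
    y = X @ true_beta + 0.1 * rng.standard_normal(100)
    print(np.round(lasso_via_adaptive_ridge(X, y, lam=5.0), 3))
```

At a fixed point of this update, the solved ridge problem reproduces the lasso stationarity condition for the nonzero coefficients, which is the sense in which the absolute-value penalty behaves like an adaptively weighted quadratic one.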

Citation (APA)

Grandvalet, Y. (1998). Least Absolute Shrinkage is Equivalent to Quadratic Penalization (pp. 201–206). https://doi.org/10.1007/978-1-4471-1599-1_27
