A Robust Variable Selection Method for Sparse Online Regression via the Elastic Net Penalty


Abstract

Variable selection is a central topic in regression, with popular methods including the lasso, SCAD, and the elastic net. These penalized regression algorithms, however, remain sensitive to noisy data. Furthermore, "concept drift" fundamentally distinguishes learning from streaming data from batch learning. This article presents a noise-resistant method for regularization and variable selection in noisy data streams with multicollinearity, dubbed the canal-adaptive elastic net. Like the elastic net, it encourages a grouping effect among correlated predictors. Compared with the lasso, the canal-adaptive elastic net is especially advantageous when the number of predictors (p) is significantly larger than the number of observations (n) and the data are multicollinear. Extensive simulation experiments confirm that the canal-adaptive elastic net achieves higher prediction accuracy than the lasso, ridge regression, and the elastic net on data with multicollinearity and noise.
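The grouping effect the abstract refers to can be illustrated with a small sketch. Note this uses scikit-learn's standard `Lasso` and `ElasticNet` estimators, not the paper's canal-adaptive elastic net (which, to my knowledge, has no packaged implementation); the simulated design, penalty strengths, and group sizes below are illustrative assumptions, not the paper's experimental setup. On a p > n design with blocks of nearly identical predictors, the elastic net tends to spread weight across a correlated group, while the lasso often selects only one member of it.

```python
import numpy as np
from sklearn.linear_model import Lasso, ElasticNet

rng = np.random.default_rng(0)
n, p = 50, 100  # p > n, as in the abstract's regime

# Two latent signals, each replicated as a block of 3 nearly identical
# (highly multicollinear) predictors; the rest are independent noise columns.
z1, z2 = rng.normal(size=(2, n))
X = np.column_stack(
    [z1 + 0.01 * rng.normal(size=n) for _ in range(3)]
    + [z2 + 0.01 * rng.normal(size=n) for _ in range(3)]
    + [rng.normal(size=n) for _ in range(p - 6)]
)
y = z1 - z2 + 0.1 * rng.normal(size=n)

# Pure L1 penalty vs. a mixed L1/L2 (elastic net) penalty.
lasso = Lasso(alpha=0.1).fit(X, y)
enet = ElasticNet(alpha=0.1, l1_ratio=0.5).fit(X, y)

# The elastic net typically keeps several members of each correlated block
# (grouping effect); the lasso tends to pick a single representative.
print("nonzero coefs in block 1 (lasso):", np.count_nonzero(lasso.coef_[:3]))
print("nonzero coefs in block 1 (enet): ", np.count_nonzero(enet.coef_[:3]))
print("total nonzero (lasso):", np.count_nonzero(lasso.coef_))
print("total nonzero (enet): ", np.count_nonzero(enet.coef_))
```

Both fits remain sparse relative to p = 100, which is the variable-selection behavior the penalties are chosen for; the difference of interest is how the nonzero coefficients are distributed within the correlated blocks.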

Citation (APA)
Wang, W., Liang, J., Liu, R., Song, Y., & Zhang, M. (2022). A Robust Variable Selection Method for Sparse Online Regression via the Elastic Net Penalty. Mathematics, 10(16). https://doi.org/10.3390/math10162985
