Bayesian group feature selection for support vector learning machines

Abstract

Group Feature Selection (GFS) has proven useful for improving the interpretability and prediction performance of learned model parameters in many machine learning and data mining applications. Existing GFS models have mainly been based on the square loss and the logistic loss for regression and classification, leaving the ϵ-insensitive loss and the hinge loss popularized by Support Vector Learning (SVL) machines unexplored. In this paper, we present a Bayesian GFS framework for SVL machines based on the pseudo-likelihood and data-augmentation ideas. With Bayesian inference, our method circumvents cross-validation over regularization parameters. Specifically, we apply the mean-field variational method in an augmented space to derive the posterior distribution of model parameters and hyper-parameters for Bayesian estimation. Both regression and classification experiments conducted on synthetic and real-world data sets demonstrate that our proposed approach outperforms a number of competitors.
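To illustrate the core idea of group feature selection under the hinge loss, the sketch below fits a linear SVM with a group-lasso penalty by proximal subgradient descent. Note this is a hedged, point-estimate analogue for illustration only: the paper instead places sparsity-inducing priors on feature groups and performs mean-field variational Bayesian inference in an augmented space, which this sketch does not implement. All function and variable names here are illustrative, not from the paper.

```python
import numpy as np

def group_sparse_svm(X, y, groups, lam=0.5, lr=0.05, n_iter=1000):
    """Illustrative sketch: linear SVM (average hinge loss) with a
    group-lasso penalty lam * sum_g ||w_g||_2, fit by proximal
    subgradient descent. Whole groups of weights are driven exactly
    to zero, which is the group-feature-selection effect.

    X: (n, d) features; y: (n,) labels in {-1, +1};
    groups: list of index arrays partitioning the d features.
    """
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        margins = y * (X @ w)
        # Subgradient of the average hinge loss: only points with
        # margin < 1 contribute.
        active = margins < 1
        grad = -(X[active].T @ y[active]) / n
        w -= lr * grad
        # Proximal step: group-wise soft-thresholding. A group whose
        # weight norm falls below lr*lam is zeroed out entirely.
        for g in groups:
            norm = np.linalg.norm(w[g])
            if norm <= lr * lam:
                w[g] = 0.0
            else:
                w[g] *= 1.0 - lr * lam / norm
    return w

# Synthetic demo: only the first group of features is informative.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 6))
y = np.sign(X[:, 0] + X[:, 1])
groups = [np.arange(0, 2), np.arange(2, 6)]
w = group_sparse_svm(X, y, groups)
# The noise group's weights are shrunk toward (or exactly to) zero,
# while the informative group survives the thresholding.
```

The frequentist penalty `lam` here plays the role that the Bayesian framework's hyper-priors play in the paper; the Bayesian treatment avoids tuning it by cross-validation.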

Citation (APA)

Du, C., Du, C., Zhe, S., Luo, A., He, Q., & Long, G. (2016). Bayesian group feature selection for support vector learning machines. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 9651, pp. 239–252). Springer Verlag. https://doi.org/10.1007/978-3-319-31753-3_20
