Evaluation of a novel GA-based methodology for model structure selection: The GA-PARSIMONY


Abstract

Most proposed metaheuristics for feature selection and model parameter optimization are based on a two-term Loss+Penalty function. Their main drawback is the need to manually set the parameter that balances the loss and penalty terms. In this paper, a novel methodology referred to as GA-PARSIMONY, specifically designed to overcome this issue, is evaluated in detail on thirteen public databases with five regression techniques. It is a GA-based metaheuristic that replaces the classic two-term minimization function with two consecutive rankings of individuals. The first ranking is based solely on the generalization error, while the second (named ReRank) is based on the complexity of the models, giving special weight to the complexity entailed by a large number of inputs. For each database, the models with the lowest testing RMSE and no statistically significant difference among them were designated winner models. Within this group, the number of features selected was below 50%, which demonstrates an optimal balance between error minimization and parsimony. In particular, the most complex algorithms (MLP and SVR) were the most frequently selected among the winner models, while using around 40–45% of the available attributes. The more basic IBk, ridge regression (LIN) and M5P were classified as winner models only on the simpler databases, but used fewer features in those cases (up to 20–25% of the initial inputs).
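The two-stage ranking described in the abstract can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the tuple representation of individuals, the `error_tol` tolerance, and the pairwise-swap rule are simplifying assumptions standing in for the paper's statistical comparison of errors.

```python
def ga_parsimony_rerank(population, error_tol=0.001):
    """Two-stage ranking sketch (hypothetical simplification).

    Each individual is a (generalization_error, complexity) tuple.
    Stage 1: rank solely by generalization error.
    Stage 2 (ReRank): among near-tied neighbors (errors within
    error_tol), promote the simpler model.
    """
    # Stage 1: sort by generalization error only.
    ranked = sorted(population, key=lambda ind: ind[0])

    # Stage 2: bubble simpler models ahead of near-tied complex ones.
    swapped = True
    while swapped:
        swapped = False
        for i in range(len(ranked) - 1):
            a, b = ranked[i], ranked[i + 1]
            if abs(a[0] - b[0]) <= error_tol and b[1] < a[1]:
                ranked[i], ranked[i + 1] = b, a
                swapped = True
    return ranked


# Example: the second model has a marginally higher error but far
# lower complexity, so the ReRank stage moves it to the front.
pop = [(0.100, 5), (0.1005, 2), (0.300, 1)]
print(ga_parsimony_rerank(pop))
```

In a full GA loop, this combined ranking would drive selection, so parsimonious models survive whenever their error is statistically indistinguishable from more complex competitors.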

Citation (APA)

Urraca, R., Sodupe-Ortega, E., Antonanzas, J., Antonanzas-Torres, F., & Martinez-de-Pison, F. J. (2018). Evaluation of a novel GA-based methodology for model structure selection: The GA-PARSIMONY. Neurocomputing, 271, 9–17. https://doi.org/10.1016/j.neucom.2016.08.154
