Feature extraction from optimization data via DataModeler's ensemble symbolic regression

Abstract

We demonstrate a means of knowledge discovery through feature extraction that exploits the search history of an optimization run. We regress a symbolic model ensemble from the optimization run's search points and their objective scores. The frequency of a variable in the models of the ensemble indicates to what extent it is an influential feature. Our demonstration uses a genetic programming symbolic regression software package that is designed to be "off-the-shelf". By default, the only parameter needed to evolve a suite of models is how long the user is willing to wait. The user can then easily specify which models go forward in terms of sufficient accuracy and complexity. For illustration purposes, we consider a common design heuristic in serial sensor sequencing: "place the most reliable sensor last". The heuristic derives from the mathematical form of the objective function, which places emphasis on the decision variable pertaining to the last sensor. Feature extraction on optimized sensor sequences indicates that the heuristic is usually effective, though not always trustworthy. This is consistent with knowledge in sensor processing. © 2010 Springer-Verlag.
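The core idea of the approach can be illustrated with a short, self-contained sketch. The snippet below is not DataModeler (a Mathematica package) and the model expressions are hypothetical placeholders; it only shows, under the assumption that the ensemble's models are available as plain expression strings parsed with SymPy, how the presence of each variable across the ensemble can be tallied to rank candidate features.

from collections import Counter

import sympy as sp

# Hypothetical ensemble of evolved symbolic models over decision
# variables x1..x4 (placeholders, not results from the paper).
ensemble = [
    "x4**2 + 0.3*x1",
    "x4 * (1 - x2)",
    "0.8*x4 + x1*x3",
    "exp(-x2) + x4",
]

def variable_presence(models):
    """Return, for each variable, the fraction of ensemble models it appears in."""
    counts = Counter()
    for expr in models:
        for sym in sp.sympify(expr).free_symbols:
            counts[str(sym)] += 1
    n = len(models)
    return {var: c / n for var, c in sorted(counts.items())}

if __name__ == "__main__":
    for var, freq in variable_presence(ensemble).items():
        print(f"{var}: present in {freq:.0%} of ensemble models")

In this made-up ensemble the variable x4 appears in every model, which is the kind of signal the paper reads as evidence of an influential feature, analogous to the decision variable for the last sensor in the sequencing example.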

CITATION STYLE

APA

Veeramachaneni, K., Vladislavleva, K., & O’Reilly, U. M. (2010). Feature extraction from optimization data via DataModeler’s ensemble symbolic regression. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 6073 LNCS, pp. 251–265). https://doi.org/10.1007/978-3-642-13800-3_28
