Parametric optimization in data mining incorporated with GA-based search

2 citations · 18 Mendeley readers · Free to access

Abstract

A number of parameters must be specified for a data-mining algorithm. Default values for these parameters are provided and generally accepted as 'good' estimates for any data set. However, data-mining models are known to be data dependent, and so are their parameters. Default values may be good estimates, but they are often not the best parameter values for a particular data set. A tuned set of parameter values can produce a data-mining model with better classification performance and higher prediction accuracy. Parameter search, however, is known to be expensive. This paper investigates GA-based heuristic techniques in a case study of optimizing the parameters of a back-propagation neural network classifier. Our experiments show that the GA-based optimization technique is capable of finding a better set of parameter values than random search. In addition, this paper extends the island model of the Parallel GA (PGA) and proposes a VC-PGA, which communicates the globally fittest individuals to local populations with reduced communication overhead. Our results show that the GA-based parallel heuristic optimization technique provides a solution to large parametric optimization problems. © 2002 Springer-Verlag Berlin Heidelberg.
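To make the idea concrete, the sketch below shows a minimal GA-based hyperparameter search for a back-propagation classifier. It is not the paper's algorithm: the parameter ranges, GA settings (elitism, uniform crossover, random-reset mutation), and the use of scikit-learn's MLPClassifier as the back-propagation network are all assumptions made for illustration.

```python
# Minimal sketch of GA-based parameter tuning for a back-propagation
# classifier. Parameter ranges, GA settings, and the choice of
# scikit-learn's MLPClassifier are illustrative assumptions, not the
# paper's exact method.
import random

from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

# Assumed search space: learning rate, momentum, hidden-layer size.
BOUNDS = {"lr": (1e-4, 0.5), "momentum": (0.0, 0.99), "hidden": (2, 64)}

# Small synthetic data set so the sketch is self-contained.
X, y = make_classification(n_samples=300, n_features=10, random_state=0)

def random_individual():
    return {
        "lr": random.uniform(*BOUNDS["lr"]),
        "momentum": random.uniform(*BOUNDS["momentum"]),
        "hidden": random.randint(*BOUNDS["hidden"]),
    }

def fitness(ind):
    # Fitness = cross-validated accuracy of an SGD-trained (back-propagation) MLP.
    clf = MLPClassifier(hidden_layer_sizes=(ind["hidden"],),
                        solver="sgd",
                        learning_rate_init=ind["lr"],
                        momentum=ind["momentum"],
                        max_iter=300)
    return cross_val_score(clf, X, y, cv=3).mean()

def crossover(a, b):
    # Uniform crossover: each gene is taken from either parent.
    return {k: random.choice((a[k], b[k])) for k in a}

def mutate(ind, rate=0.2):
    # Random-reset mutation: re-sample each gene with a small probability.
    out = dict(ind)
    if random.random() < rate:
        out["lr"] = random.uniform(*BOUNDS["lr"])
    if random.random() < rate:
        out["momentum"] = random.uniform(*BOUNDS["momentum"])
    if random.random() < rate:
        out["hidden"] = random.randint(*BOUNDS["hidden"])
    return out

def ga_search(pop_size=8, generations=4):
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        elite = scored[: pop_size // 2]  # keep the fitter half
        children = [mutate(crossover(random.choice(elite), random.choice(elite)))
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    return max(pop, key=fitness)

if __name__ == "__main__":
    best = ga_search()
    print("best parameters:", best, "accuracy:", round(fitness(best), 3))
```

In an island-model parallel GA, several such populations would evolve independently and periodically exchange their fittest individuals; the VC-PGA described in the abstract extends this scheme so that the globally fittest individuals reach the local populations with reduced communication overhead.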

Citation (APA)

Tam, L., Taniar, D., & Smith, K. (2002). Parametric optimization in data mining incorporated with GA-based search. In Lecture Notes in Computer Science (Vol. 2329, pp. 582–591). Springer-Verlag. https://doi.org/10.1007/3-540-46043-8_59
