Learning and evolution by minimization of mutual information


Abstract

Based on negative correlation learning [1] and evolutionary learning, evolutionary ensembles with negative correlation learning (EENCL) were proposed for learning and designing neural network ensembles [2]. The idea of EENCL is to regard the population of neural networks as an ensemble and the evolutionary process as the design of the ensemble. EENCL used a fitness sharing scheme based on the covering set, which did not measure the similarity among individuals in the population accurately. In this paper, a fitness sharing scheme based on mutual information is introduced into EENCL to evolve a diverse and cooperative population. The effectiveness of this evolutionary learning approach was tested on two real-world problems. The paper also analyzes negative correlation learning in terms of mutual information on a regression task under different noise conditions.
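The abstract does not give the exact fitness-sharing formula, so the sketch below only illustrates the general idea: estimating pairwise mutual information between ensemble members' outputs under a Gaussian assumption, I(i, j) = -1/2 ln(1 - rho_ij^2), and penalizing individuals that share much information with the rest of the population. The function names and the combination rule (raw fitness divided by one plus total shared information) are illustrative assumptions, not the scheme from the paper.

```python
import numpy as np

def pairwise_mutual_information(outputs):
    """Estimate pairwise mutual information between ensemble members'
    outputs under a Gaussian assumption: I(i, j) = -0.5 * ln(1 - rho_ij^2),
    where rho_ij is the correlation between the outputs of networks i and j
    on the same inputs. `outputs` has shape (n_networks, n_samples)."""
    corr = np.corrcoef(outputs)                      # (M, M) correlation matrix
    rho2 = np.clip(corr ** 2, 0.0, 1.0 - 1e-12)      # guard against log(0)
    mi = -0.5 * np.log(1.0 - rho2)
    np.fill_diagonal(mi, 0.0)                        # ignore self-information
    return mi

def shared_fitness(raw_fitness, outputs):
    """Illustrative fitness sharing (an assumption, not the paper's rule):
    discount each network's raw fitness by the total mutual information it
    shares with the rest of the population, so selection favors members
    that are both accurate and mutually independent."""
    mi = pairwise_mutual_information(outputs)
    return raw_fitness / (1.0 + mi.sum(axis=1))

# Usage example: 5 networks, each evaluated on 100 inputs
rng = np.random.default_rng(0)
outputs = rng.normal(size=(5, 100))
raw_fitness = rng.uniform(0.5, 1.0, size=5)
print(shared_fitness(raw_fitness, outputs))
```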

Citation (APA)

Liu, Y., & Yao, X. (2002). Learning and evolution by minimization of mutual information. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 2439, pp. 495–504). Springer Verlag. https://doi.org/10.1007/3-540-45712-7_48
