Effective learning with heterogeneous neural networks


Abstract

This paper introduces a class of neuron models accepting heterogeneous inputs and weights. The neuron model computes a user-defined similarity function between inputs and weights. The neuron transfer function is formed by composing an adapted logistic function with the power mean of the partial input-weight similarities. The resulting neuron model can deal directly with mixtures of continuous quantities (crisp or fuzzy) and discrete quantities (ordinal, integer, binary or nominal). There is also provision for missing values. An artificial neural network using these neuron models is trained with a breeder genetic algorithm until convergence. A number of experiments are carried out on several real-world benchmark problems. The network is compared to a standard radial basis function network and to a multi-layer perceptron, and is shown to learn from non-trivial data sets with superior generalization ability in most cases, at a comparable computational cost. A further advantage is the interpretability of the learned weights. © 2008 Springer-Verlag Berlin Heidelberg.
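The abstract's description of the neuron model can be sketched in code: per-attribute partial similarities for heterogeneous types, a power mean to aggregate them (skipping missing values), and an adapted logistic squashing function. The specific similarity formulas, the ordinal rank count, and the logistic gain below are illustrative assumptions, not the paper's exact definitions.

```python
import math

def partial_similarity(x, w, kind):
    """Similarity in [0, 1] between one input and one weight, by attribute type.
    The per-type formulas here are common choices, assumed for illustration."""
    if x is None or w is None:          # missing value: excluded from the mean
        return None
    if kind == "continuous":            # assumes inputs scaled to [0, 1]
        return 1.0 - abs(x - w)
    if kind == "ordinal":               # ranks 1..k, normalized rank distance
        k = 5                           # assumed number of ranks
        return 1.0 - abs(x - w) / (k - 1)
    if kind == "nominal":               # overlap similarity
        return 1.0 if x == w else 0.0
    raise ValueError(f"unknown attribute kind: {kind}")

def power_mean(values, p):
    """Generalized (power) mean over the non-missing similarities:
    p=1 arithmetic, p->0 geometric, p=-1 harmonic."""
    vals = [v for v in values if v is not None]
    if not vals:
        return 0.0
    if p == 0:                          # geometric mean, guarded against log(0)
        return math.exp(sum(math.log(max(v, 1e-12)) for v in vals) / len(vals))
    return (sum(v ** p for v in vals) / len(vals)) ** (1.0 / p)

def heterogeneous_neuron(x, w, kinds, p=1.0, gain=4.0):
    """Aggregate partial similarities with a power mean, then squash with a
    logistic adapted to map [0, 1] onto (0, 1); the gain is an assumption."""
    sims = [partial_similarity(xi, wi, t) for xi, wi, t in zip(x, w, kinds)]
    s = power_mean(sims, p)
    return 1.0 / (1.0 + math.exp(-gain * (2.0 * s - 1.0)))
```

For example, an input with a missing nominal attribute still yields an activation, since the missing component simply drops out of the mean:

```python
heterogeneous_neuron([0.8, 2, None], [0.6, 2, "red"],
                     ["continuous", "ordinal", "nominal"])
```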

Citation (APA)

Belanche-Muñoz, L. A. (2008). Effective learning with heterogeneous neural networks. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4984 LNCS, pp. 328–337). https://doi.org/10.1007/978-3-540-69158-7_35
