Determining regularization parameters for derivative free neural learning

3 citations · 4 readers (Mendeley users who have this article in their library)

Abstract

Derivative-free optimisation methods have recently gained a lot of attention for neural learning. The curse of dimensionality in the neural learning problem makes local optimization methods very attractive; however, the error surface contains many local minima. The discrete gradient method is a special case of derivative-free methods based on bundle methods and has the ability to jump over many local minima. Two types of problems arise when local optimization methods are used for neural learning. The first is sensitivity to the initial weights, which is commonly addressed by using a hybrid model. Our earlier research has shown that combining the discrete gradient method with global methods such as evolutionary algorithms makes it even more attractive; such hybrid models have also been studied by other researchers. The second, less often mentioned problem is that of large weight values for the synaptic connections of the network. Large synaptic weights often lead to network paralysis and convergence problems, especially when a hybrid model is used for fine-tuning the learning task. In this paper we study and analyse the effect of different regularization parameters in our objective function, which restrict the weight values without compromising classification accuracy. © Springer-Verlag Berlin Heidelberg 2005.
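
The key idea is that a penalty term, weighted by a regularization parameter, is added to the learning objective so that the optimiser is discouraged from driving the synaptic weights to large values. As a minimal sketch, assuming an L2 (weight-decay) penalty and a squared-error data-fit term, which are common choices but are not stated explicitly in the abstract, the regularized objective might look like the following; the parameter name lam and the function name are illustrative only.

    import numpy as np

    def regularized_objective(weights, predictions, targets, lam):
        # Data-fit term: mean squared error over the training examples.
        mse = np.mean((predictions - targets) ** 2)
        # Penalty term (assumed L2 form): discourages large synaptic
        # weights, which the abstract links to network paralysis and
        # convergence problems.
        penalty = lam * np.sum(weights ** 2)
        return mse + penalty

    # Example: evaluate the objective for a small weight vector.
    w = np.array([0.5, -1.2, 3.0])
    preds = np.array([0.9, 0.1])
    targets = np.array([1.0, 0.0])
    print(regularized_objective(w, preds, targets, lam=0.01))

Increasing lam shrinks the optimal weight magnitudes, which is the mechanism relied on here to avoid paralysis; setting it too large can hurt classification accuracy, hence the paper's study of different regularization parameter values.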

Citation (APA style)

Ghosh, R., Ghosh, M., Yearwood, J., & Bagirov, A. (2005). Determining regularization parameters for derivative free neural learning. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 3587 LNAI, pp. 71–79). Springer Verlag. https://doi.org/10.1007/11510888_8
