Global Optimisation of Neural Networks Using a Deterministic Hybrid Approach

  • Beliakov G
  • Abraham A

Abstract

Selecting the topology of a neural network and the correct parameters for the learning algorithm is a tedious task when designing an optimal artificial neural network, one that is smaller, faster, and has better generalization performance. In this paper we introduce the recently developed cutting angle method (a deterministic technique) for global optimization of connection weights. Neural networks are initially trained using the cutting angle method, and the learning is later fine-tuned (meta-learning) using conventional gradient descent or other optimization techniques. Experiments were carried out on three time-series benchmarks, and a comparison was made with evolutionary neural networks. Our preliminary experimental results show that the proposed deterministic approach can provide near-optimal results much faster than the evolutionary approach.
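The two-phase scheme in the abstract — a deterministic global search over the weights followed by local gradient-based fine-tuning — can be sketched as follows. This is a minimal illustration only: the toy model, data, grid ranges, and learning rate are assumptions, and a uniform grid stands in for the actual cutting angle method, which instead builds piecewise lower bounds of a Lipschitz objective rather than sampling exhaustively.

```python
import numpy as np

# Toy data from a known one-neuron target: y = 2 * tanh(0.5 * x)
x = np.linspace(-4, 4, 50)
y = 2.0 * np.tanh(0.5 * x)

def loss(a, b):
    """Mean squared error of the one-neuron model y_hat = a * tanh(b * x)."""
    return np.mean((a * np.tanh(b * x) - y) ** 2)

# Phase 1: deterministic coarse global search over the two weights.
# (A uniform grid here; the cutting angle method would instead refine
# a piecewise lower bound of the objective to locate the global region.)
grid = np.linspace(-3.0, 3.0, 21)
a0, b0 = min(((a, b) for a in grid for b in grid), key=lambda p: loss(*p))
coarse = loss(a0, b0)

# Phase 2: fine-tune with plain gradient descent from the coarse optimum,
# using the analytic gradients of the mean squared error.
a, b = a0, b0
lr = 0.05
for _ in range(500):
    t = np.tanh(b * x)
    r = a * t - y                              # residuals
    grad_a = 2.0 * np.mean(r * t)
    grad_b = 2.0 * np.mean(r * a * x * (1.0 - t ** 2))
    a -= lr * grad_a
    b -= lr * grad_b

print(coarse, loss(a, b))  # fine-tuned loss is at or below the coarse one
```

The point of the hybrid is visible even in this sketch: the global phase avoids committing to a poor basin of attraction, while the cheap local phase polishes the solution to near-optimal accuracy.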


CITATION STYLE

APA

Beliakov, G., & Abraham, A. (2002). Global Optimisation of Neural Networks Using a Deterministic Hybrid Approach. In Hybrid Information Systems (pp. 79–92). Physica-Verlag HD. https://doi.org/10.1007/978-3-7908-1782-9_8
