Hidden-layer size reducing for multilayer neural networks using the orthogonal least-squares method


Abstract

This paper proposes a new approach to reducing the hidden-layer size of multilayer neural networks, using the orthogonal least-squares (OLS) method with the aid of the Gram-Schmidt orthogonal transformation. A neural network with a large hidden layer is first trained via a standard training rule. The OLS method is then introduced to identify and eliminate redundant neurons, yielding a simpler neural network. The OLS method is employed as a forward regression procedure that selects a suitable subset from the large set of preliminarily trained hidden neurons, such that the input to the output-layer neuron is reconstructed with fewer hidden neurons. Simulation results are included to show the efficiency of the proposed method.
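The abstract describes a forward regression: candidate hidden-neuron outputs are Gram-Schmidt orthogonalized against the neurons already chosen, and at each step the neuron whose orthogonalized output explains the most remaining variance of the output-layer input is kept. The sketch below illustrates this style of OLS subset selection; the function name, the error-reduction-ratio stopping form, and all variable names are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def ols_select(H, d, n_select):
    """Greedy forward OLS selection of hidden-neuron outputs (sketch).

    H : (N, M) matrix whose columns are the M candidate hidden neurons'
        outputs over N training samples.
    d : (N,) target vector, e.g. the input to the output-layer neuron
        produced by the full (unpruned) hidden layer.
    Returns the indices of the selected neurons, in selection order.
    """
    selected, Q = [], []          # chosen columns and their orthogonal basis
    dd = d @ d                    # total "energy" of the target
    remaining = list(range(H.shape[1]))
    for _ in range(n_select):
        best_err, best_j, best_q = -1.0, None, None
        for j in remaining:
            w = H[:, j].astype(float).copy()
            for q in Q:           # Gram-Schmidt against already-chosen neurons
                w -= (q @ H[:, j]) / (q @ q) * q
            wTw = w @ w
            if wTw < 1e-12:       # column is redundant (already spanned)
                continue
            g = (w @ d) / wTw     # OLS coefficient on the orthogonal direction
            err = g * g * wTw / dd  # error-reduction ratio of this neuron
            if err > best_err:
                best_err, best_j, best_q = err, j, w
        if best_j is None:        # nothing independent left to add
            break
        selected.append(best_j)
        Q.append(best_q)
        remaining.remove(best_j)
    return selected
```

Because the retained directions are orthogonal, each neuron's error-reduction ratio is independent of selection order, which is what makes the greedy forward pass effective for pruning.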

Citation (APA)
Yang, Z. J. (1997). Hidden-layer size reducing for multilayer neural networks using the orthogonal least-squares method. In Proceedings of the SICE Annual Conference (pp. 1089–1092). Society of Instrument and Control Engineers (SICE). https://doi.org/10.9746/sicetr1965.33.216
