Second order back propagation neural network (SOBPNN) algorithm for medical data classification

Abstract

Gradient-based methods are among the most widely used error minimization techniques for training back propagation neural networks (BPNN). Some second order learning methods work with a quadratic approximation of the error function, determined from the calculation of the Hessian matrix, and achieve improved convergence rates in many cases. This paper introduces an improved second order back propagation algorithm that efficiently calculates the Hessian matrix by adaptively modifying the search direction. It suggests that a simple modification to the initial search direction, i.e. the gradient of the error with respect to the weights, can substantially improve training efficiency. The efficiency of the proposed SOBPNN is verified by means of simulations on five medical data classification problems. The results show that SOBPNN significantly improves the learning performance of BPNN.
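The abstract does not give the SOBPNN update rule itself, but the general second-order scheme it builds on — a quadratic approximation of the error with the step obtained from the Hessian, w ← w − H⁻¹g — can be sketched as follows. This is an illustrative example only: the function and variable names are hypothetical, the model is a tiny linear least-squares problem (not a BPNN), and the damping term is a common practical safeguard, not a detail taken from the paper.

```python
import numpy as np

def second_order_step(w, X, y, damping=1e-6):
    """One Newton-style step for the quadratic error E = 0.5 * ||Xw - y||^2.

    Illustrative sketch of a second-order update of the kind the abstract
    describes; the actual SOBPNN search-direction modification is not
    specified here.
    """
    residual = X @ w - y
    g = X.T @ residual                  # gradient of E w.r.t. the weights
    H = X.T @ X                         # Hessian of E (exact for this quadratic)
    H += damping * np.eye(H.shape[0])   # damping keeps H safely invertible
    direction = np.linalg.solve(H, g)   # second-order search direction H^{-1} g
    return w - direction

# Toy data: a quadratic error surface, so one Newton step lands at the minimum.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w
w = second_order_step(np.zeros(3), X, y)
print(np.allclose(w, true_w, atol=1e-4))
```

On a genuinely quadratic error such as this one, the second-order step recovers the minimizer in a single update, which is the convergence advantage (relative to plain gradient descent) that second-order BPNN variants aim to retain near a minimum.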

Citation (APA)

Nawi, N. M., Hamid, N. A., Harsad, N., & Ramli, A. A. (2015). Second order back propagation neural network (SOBPNN) algorithm for medical data classification. In Advances in Intelligent Systems and Computing (Vol. 331, pp. 73–83). Springer Verlag. https://doi.org/10.1007/978-3-319-13153-5_8
