Comparison of Artificial Neural Network Architecture in Solving Ordinary Differential Equations

  • Mall, S.
  • Chakraverty, S.
This paper investigates the solution of Ordinary Differential Equations (ODEs) with initial conditions using a Regression-Based Algorithm (RBA) and compares results obtained with arbitrary and regression-based initial weights for different numbers of nodes in the hidden layer. A feedforward neural network is used together with error back-propagation to minimize the error function and to update the parameters (weights and biases). Initial weights are taken either at random or from the proposed regression-based model, and the method is applied to a variety of problems whose results are compared. The number of nodes in the hidden layer is fixed according to the degree of the polynomial used in the regression fitting: the input and output data are first fitted with polynomials of various degrees using regression analysis, and the resulting coefficients are taken as initial weights to start the neural training. For the example problems, analytical results are compared with neural results obtained using arbitrary and regression-based weights with four, five, and six nodes in the hidden layer and are found to be in good agreement.
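The regression-based initialization described above can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the helper name `regression_based_init` and the specific mapping of polynomial coefficients onto input-to-hidden weights and biases are assumptions for demonstration; the abstract only states that regression coefficients seed the initial weights and that the polynomial degree fixes the hidden-node count.

```python
import numpy as np

def regression_based_init(x, y, degree):
    """Fit a polynomial of the given degree to (x, y) training data and
    use its coefficients as initial neural-network parameters.

    Hypothetical helper illustrating the abstract's idea; the paper's
    precise weight layout is not specified here.
    """
    # Least-squares polynomial fit: returns degree+1 coefficients,
    # highest power first.
    coeffs = np.polyfit(x, y, degree)

    # Number of hidden nodes is fixed by the polynomial degree.
    n_hidden = degree

    # Assumed mapping: the leading coefficients seed the input->hidden
    # weights, and the constant term is spread across the hidden biases.
    w_hidden = coeffs[:n_hidden]
    b_hidden = np.full(n_hidden, coeffs[-1] / n_hidden)
    return n_hidden, w_hidden, b_hidden
```

Starting back-propagation from these regression-derived values, rather than from purely random weights, is the mechanism the paper credits for the improved agreement with analytical solutions.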




Mall, S., & Chakraverty, S. (2013). Comparison of Artificial Neural Network Architecture in Solving Ordinary Differential Equations. Advances in Artificial Neural Systems, 2013, 1–12.
