Feedforward back propagation neural network (ffbpnn) based approach for the identification of handwritten math equations


Abstract

The demand for the identification of handwritten mathematical equations is increasing day by day. Despite this interest, recognition remains a challenging task because of ambiguity among symbols, two-dimensional layout, touching symbols, and the complexity of mathematical equations. Statistical as well as complex features such as skewness, kurtosis, entropy, mean, variance, and standard deviation have been considered. Classification and training are performed using neural networks (NN), and the recognition rate depends on both the classifier used and the features extracted. Execution speed, efficiency, and recognition rate are enhanced by using a feed-forward back-propagation neural network (FFBPNN) with a gradient-descent training function and a learning rule combining momentum and adaptive learning. The system takes scanned images of handwritten mathematical equations, from simple to complex, and classifies them by equation type, e.g. straight-line equation, law of indices, law of gravity, roots of a quadratic expression, area of a circle, convolution summation, and convolution integration.
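The pipeline the abstract describes, statistical features feeding a feed-forward back-propagation network trained with gradient descent and momentum, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the feature set (mean, variance, standard deviation, skewness, kurtosis, entropy) follows the abstract, while the network sizes, learning rate, and momentum value are assumptions, and the adaptive-learning-rate component is omitted for brevity.

```python
import numpy as np

def statistical_features(img):
    """Statistical features named in the abstract: mean, variance,
    standard deviation, skewness, kurtosis, and (histogram) entropy."""
    x = img.astype(float).ravel()
    mu, var = x.mean(), x.var()
    sd = np.sqrt(var) + 1e-12          # avoid division by zero on flat images
    skew = np.mean(((x - mu) / sd) ** 3)
    kurt = np.mean(((x - mu) / sd) ** 4)
    hist, _ = np.histogram(x, bins=32)
    p = hist[hist > 0] / hist.sum()
    entropy = -np.sum(p * np.log2(p))
    return np.array([mu, var, sd, skew, kurt, entropy])

class FFBPNN:
    """One-hidden-layer feed-forward net trained by back-propagation
    with a momentum term (hyperparameters are illustrative)."""
    def __init__(self, n_in, n_hidden, n_out, lr=0.1, momentum=0.9, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0.0, 0.5, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.5, (n_hidden, n_out))
        self.b2 = np.zeros(n_out)
        self.lr, self.momentum = lr, momentum
        # One velocity buffer per parameter for the momentum update.
        self.v = [np.zeros_like(p) for p in (self.W1, self.b1, self.W2, self.b2)]

    @staticmethod
    def _sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def forward(self, X):
        self.h = self._sigmoid(X @ self.W1 + self.b1)
        self.o = self._sigmoid(self.h @ self.W2 + self.b2)
        return self.o

    def train_step(self, X, y):
        """One gradient-descent-with-momentum step on squared error."""
        out = self.forward(X)
        d_out = (out - y) * out * (1 - out)          # delta at output layer
        d_hid = (d_out @ self.W2.T) * self.h * (1 - self.h)
        grads = [X.T @ d_hid, d_hid.sum(0), self.h.T @ d_out, d_out.sum(0)]
        params = [self.W1, self.b1, self.W2, self.b2]
        for i, (p, g) in enumerate(zip(params, grads)):
            self.v[i] = self.momentum * self.v[i] - self.lr * g
            p += self.v[i]                            # momentum update
        return np.mean((out - y) ** 2)
```

In use, each scanned equation image would be reduced to its six-element feature vector and the network trained on one-hot labels for the equation classes (straight line, law of indices, and so on); the momentum term smooths the gradient-descent trajectory, which is one reason the abstract cites improved execution speed.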

Citation (APA)

Shinde, S., Wadhwa, L., & Bhalke, D. (2021). Feedforward back propagation neural network (ffbpnn) based approach for the identification of handwritten math equations. In Advances in Intelligent Systems and Computing (Vol. 1200 AISC, pp. 757–775). Springer. https://doi.org/10.1007/978-3-030-51859-2_69
