Differential-weighted global optimum of BP neural network on image classification

Abstract

This paper investigates the problem of image classification with limited or no annotations but abundant unlabelled data. We propose DBP (Differential-weighted Global Optimum of BP Neural Network) to make the performance of the BP neural network more stable. Specifically, the best weights found during the iterative training process are saved as a potential global optimum and, for the first time, the BP neural network is combined with these potential global weights to adjust the parameters in the backward feedback process. When the model has fallen into a local optimum, the current parameters are replaced with the saved potential global optimal weights to continue optimization. In addition, we consider EP, CNN, and SIFT image features and conduct experiments on eight standard datasets. The results show that DBP mostly outperforms other state-of-the-art supervised and semi-supervised learning methods.
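The abstract does not give the exact update rule, but the core idea it describes (saving the best weights seen during training as a potential global optimum and restoring them when the network appears stuck in a local optimum) can be sketched as below. This is a minimal illustrative sketch, not the authors' implementation: the toy network, the stagnation threshold, and all variable names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy one-hidden-layer BP network on synthetic two-class data (assumed setup).
X = rng.normal(size=(200, 10))
y = (X[:, :2].sum(axis=1) > 0).astype(float).reshape(-1, 1)

W1 = rng.normal(scale=0.1, size=(10, 16))
W2 = rng.normal(scale=0.1, size=(16, 1))
lr = 0.5

best_loss = np.inf
best_weights = (W1.copy(), W2.copy())   # potential global optimum so far
stall = 0                               # epochs without improvement

for epoch in range(500):
    # Forward pass.
    h = sigmoid(X @ W1)
    p = sigmoid(h @ W2)
    loss = float(np.mean((p - y) ** 2))

    # Save the best weights seen so far as the potential global optimum.
    if loss < best_loss - 1e-6:
        best_loss, best_weights, stall = loss, (W1.copy(), W2.copy()), 0
    else:
        stall += 1

    # Heuristic (assumed): if the loss has stagnated, treat the model as
    # stuck in a local optimum and restore the saved potential global
    # optimal weights before continuing.
    if stall >= 20:
        W1, W2 = best_weights[0].copy(), best_weights[1].copy()
        stall = 0

    # Backward pass: standard BP gradients for MSE loss with sigmoid units.
    d_out = (p - y) * p * (1 - p)
    d_hid = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out / len(X)
    W1 -= lr * X.T @ d_hid / len(X)

print(f"final loss {loss:.4f}, best loss {best_loss:.4f}")
```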

Citation (APA)

Ma, L., Lin, X., & Jiang, L. (2017). Differential-weighted global optimum of BP neural network on image classification. In Lecture Notes in Electrical Engineering (Vol. 424, pp. 544–552). Springer Verlag. https://doi.org/10.1007/978-981-10-4154-9_63
