Parallel batch pattern training algorithm for MLP with two hidden layers on many-core system


Abstract

This paper presents the development of a parallel batch pattern back propagation training algorithm for a multilayer perceptron with two hidden layers, together with a study of its parallelization efficiency on a many-core system. The multilayer perceptron model and the batch pattern training algorithm are described theoretically, and an algorithmic description of the parallel batch pattern training method is given. The results show high parallelization efficiency of the developed algorithm on a many-core parallel system with 48 CPUs using MPI technology. © Springer International Publishing Switzerland 2014.
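The full text is not included here, but the core idea the abstract names can be illustrated with a small sketch: in batch pattern training, the gradients for all training patterns are accumulated before a single weight update, so the per-pattern gradient computations are independent and straightforward to distribute (e.g. across MPI ranks, as in the paper). The network sizes, learning rate, and toy data below are illustrative assumptions, not taken from the paper.

```python
# Sketch of batch pattern training for an MLP with two hidden layers.
# Gradients are summed over ALL patterns before one weight update; that
# summation is what makes the method easy to parallelize with MPI.
# All hyperparameters and data here are assumed for illustration.
import numpy as np

rng = np.random.default_rng(0)

# Toy MLP: 2 inputs -> 4 hidden -> 4 hidden -> 1 output, tanh activations.
W1 = rng.standard_normal((2, 4)) * 0.5
W2 = rng.standard_normal((4, 4)) * 0.5
W3 = rng.standard_normal((4, 1)) * 0.5

X = rng.standard_normal((32, 2))   # 32 training patterns
y = X[:, :1] * X[:, 1:2]           # toy regression target

def forward(X):
    h1 = np.tanh(X @ W1)
    h2 = np.tanh(h1 @ W2)
    return h1, h2, h2 @ W3         # linear output layer

def batch_step(lr=0.05):
    """One batch pattern step: accumulate gradients over all patterns,
    then apply a single weight update."""
    global W1, W2, W3
    h1, h2, out = forward(X)
    err = out - y                  # dE/d(out) for squared error
    # The matrix products below implicitly SUM over patterns -- this is
    # the batch accumulation; in an MPI version each rank would compute
    # the partial sums for its own subset of patterns and reduce them.
    g3 = h2.T @ err
    d2 = (err @ W3.T) * (1 - h2**2)
    g2 = h1.T @ d2
    d1 = (d2 @ W2.T) * (1 - h1**2)
    g1 = X.T @ d1
    W1 -= lr * g1 / len(X)
    W2 -= lr * g2 / len(X)
    W3 -= lr * g3 / len(X)

loss_before = float(np.mean((forward(X)[2] - y) ** 2))
for _ in range(200):
    batch_step()
loss_after = float(np.mean((forward(X)[2] - y) ** 2))
```

In an MPI implementation, only the gradient accumulation loop is distributed: each process computes partial gradient sums on its share of the patterns and the sums are combined (e.g. via an all-reduce) before the shared weight update.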

Citation (APA)

Turchenko, V. (2014). Parallel batch pattern training algorithm for MLP with two hidden layers on many-core system. In Advances in Intelligent Systems and Computing (Vol. 290, pp. 537–544). Springer Verlag. https://doi.org/10.1007/978-3-319-07593-8_62
