In on-line gradient descent learning, the local property of the derivative of the output function can cause slow convergence. This phenomenon, called a plateau, occurs in the learning process of multilayer networks. To improve the derivative term, we propose a method that replaces it with a constant, which greatly increases the relaxation speed. Moreover, replacing the derivative term with a second-order expansion of the derivative breaks a plateau faster than the original method. © 2014 Springer International Publishing.
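The update rule described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a tanh soft committee machine in a teacher-student setting, and the constant value `c = 1.0` used in place of the derivative is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(0)
N, K = 10, 2      # input dimension, number of hidden units
eta = 0.1         # learning rate

# Teacher and student soft committee machines: f(x) = sum_k tanh(w_k . x)
B = rng.standard_normal((K, N))         # teacher weights (fixed)
W = 0.01 * rng.standard_normal((K, N))  # student weights (learned)

def output(weights, x):
    return np.tanh(weights @ x).sum()

def update(W, x, err, use_constant):
    """One on-line gradient step for each hidden unit k:
    dw_k = -(eta / N) * err * g'(h_k) * x, with g = tanh.
    When use_constant is True, g'(h_k) is replaced by the
    constant c (assumed here to be 1.0), as in the proposed method."""
    h = W @ x                                    # local fields of the student
    c = 1.0                                      # assumed constant
    deriv = np.full(K, c) if use_constant else 1.0 - np.tanh(h) ** 2
    return W - (eta / N) * err * deriv[:, None] * x[None, :]

for step in range(1000):
    x = rng.standard_normal(N)
    err = output(W, x) - output(B, x)            # student minus teacher
    W = update(W, x, err, use_constant=True)
```

Note that the teacher remains a fixed point of the modified dynamics, since the error term vanishes there regardless of how the derivative factor is chosen.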
CITATION STYLE
Hara, K., & Katahira, K. (2014). Soft committee machine using simple derivative term. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 8467 LNAI, pp. 59–66). Springer Verlag. https://doi.org/10.1007/978-3-319-07173-2_6