This paper analyzes the effect of momentum on steepest descent training for quadratic performance functions. Global convergence conditions for the steepest descent algorithm are obtained by directly analyzing the exact momentum equations for quadratic cost functions. These conditions can be derived directly from the entries of the Hessian matrix, in contrast to existing conditions, which rely on its eigenvalues. The results presented in this paper are new. © Springer-Verlag Berlin Heidelberg 2004.
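As context for the setting the abstract describes, the following is a minimal sketch of steepest descent with momentum (the heavy-ball update) on a quadratic cost. The matrix, step size, and momentum coefficient are illustrative choices, not values from the paper; convergence here merely demonstrates the kind of behavior whose conditions the paper characterizes.

```python
import numpy as np

# Quadratic cost f(x) = 0.5 x^T A x - b^T x, with A symmetric
# positive definite (A is the Hessian of f). Values are illustrative.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = np.linalg.solve(A, b)   # unique minimizer of f

x = np.zeros(2)                  # initial point
v = np.zeros(2)                  # velocity term for momentum
alpha, mu = 0.1, 0.9             # step size and momentum coefficient

for _ in range(500):
    grad = A @ x - b             # exact gradient of the quadratic
    v = mu * v - alpha * grad    # heavy-ball (momentum) update
    x = x + v

print(np.allclose(x, x_star, atol=1e-6))
```

For this choice of `alpha` and `mu`, the iterates converge to the minimizer; for poorly chosen parameters relative to the Hessian, the same recursion diverges, which is why explicit convergence conditions are of interest.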
CITATION STYLE
Zeng, Z., Huang, D. S., & Wang, Z. (2004). Global convergence of steepest descent for quadratic functions. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 3177, 672–677. https://doi.org/10.1007/978-3-540-28651-6_99