Comparative study of the CG and HBF ODEs used in the global minimization of nonconvex functions


Abstract

This paper presents a unified control Liapunov function (CLF) approach to the design of heavy ball with friction (HBF) and conjugate gradient (CG) neural networks that aim to minimize scalar nonconvex functions with continuous first- and second-order derivatives and a unique global minimum. This approach leads naturally to second-order differential equations that serve as the mathematical models of the corresponding neural network implementations. Preliminary numerical simulations indicate that, on a small suite of benchmark test problems, a continuous version of the well-known conjugate gradient algorithm, designed by the proposed CLF method, outperforms its HBF competitor. © 2009 Springer Berlin Heidelberg.
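To make the HBF side of the comparison concrete, here is a minimal sketch of the standard heavy-ball-with-friction ODE, x''(t) + γ x'(t) + ∇f(x(t)) = 0, integrated with a semi-implicit Euler scheme. This is the generic HBF dynamics, not the paper's CLF-derived design; the test function, friction coefficient γ, step size, and horizon are all illustrative assumptions.

```python
# Illustrative nonconvex 1-D test function (not from the paper):
# f(x) = (x^2 - 1)^2 + 0.3*x, smooth, with two local minima.
def f(x):
    return (x**2 - 1) ** 2 + 0.3 * x

def grad_f(x):
    return 4.0 * x * (x**2 - 1) + 0.3

def hbf_minimize(grad, x0, gamma=3.0, h=1e-3, steps=20000):
    """Semi-implicit Euler discretization of the HBF ODE
    x'' + gamma * x' + grad f(x) = 0 (gamma, h, steps are illustrative)."""
    x, v = float(x0), 0.0
    for _ in range(steps):
        v += -h * (gamma * v + grad(x))  # velocity damped by friction and pulled by -grad
        x += h * v                        # position follows the updated velocity
    return x

xf = hbf_minimize(grad_f, x0=2.0)
```

Note that, as the paper's focus on global minimization suggests, plain HBF dynamics only settle at a stationary point of whichever basin the trajectory enters; the run above ends near a local minimizer of f, not necessarily the global one.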

Citation (APA)

Bhaya, A., Pazos, F. A., & Kaszkurewicz, E. (2009). Comparative study of the CG and HBF ODEs used in the global minimization of nonconvex functions. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 5768 LNCS, pp. 668–677). https://doi.org/10.1007/978-3-642-04274-4_69
