A new method for solving large nonlinear optimization problems is outlined. It attempts to combine the best properties of the discrete truncated Newton method and the limited memory BFGS method, to produce an algorithm that is both economical and capable of handling ill-conditioned problems. The key idea is to use the curvature information generated during the computation of the discrete Newton step to improve the limited memory BFGS approximations. The numerical performance of the new method is studied using a family of functions whose nonlinearity and condition number can be controlled.
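The core idea can be sketched in code: the inner conjugate-gradient solve of a truncated Newton iteration applies the Hessian to vectors via finite differences of gradients, and every such product yields a curvature pair (s, y) that can be stored in an L-BFGS memory at no extra cost. The sketch below is an illustration under stated assumptions, not the authors' algorithm: the alternation between Newton and L-BFGS steps, the memory size, the line search, and the test quadratic are all choices made here for demonstration.

```python
import numpy as np

def fd_hess_vec(grad, x, v, eps=1e-6):
    """Approximate the Hessian-vector product H(x) v by a forward
    gradient difference: (g(x + h v) - g(x)) / h."""
    h = eps / max(np.linalg.norm(v), 1e-12)
    return (grad(x + h * v) - grad(x)) / h

def lbfgs_two_loop(g, pairs):
    """Standard L-BFGS two-loop recursion: apply the inverse-Hessian
    approximation built from the stored (s, y) pairs to the vector g."""
    q = g.copy()
    alphas = []
    for s, y in reversed(pairs):              # newest pair first
        rho = 1.0 / y.dot(s)
        a = rho * s.dot(q)
        alphas.append((a, rho))
        q = q - a * y
    if pairs:                                 # initial scaling gamma = s'y / y'y
        s, y = pairs[-1]
        q = q * (s.dot(y) / y.dot(y))
    for (s, y), (a, rho) in zip(pairs, reversed(alphas)):
        b = rho * y.dot(q)
        q = q + (a - b) * s
    return q

def tn_with_memory(f, grad, x0, m=10, outer=30, cg_iters=5, tol=1e-8):
    """Toy truncated-Newton loop: the inner CG solve of H p = -g uses
    finite-difference Hessian-vector products, and each (d, Hd) pair with
    positive curvature is harvested into an L-BFGS memory.  Odd outer
    iterations take a cheap L-BFGS step from that memory (this alternation
    schedule is an assumption made for illustration)."""
    x, pairs = x0.copy(), []
    for k in range(outer):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        if k % 2 == 0 or not pairs:           # truncated-Newton iteration
            p = np.zeros_like(x)
            r = -g.copy()
            d = r.copy()
            for _ in range(cg_iters):
                Hd = fd_hess_vec(grad, x, d)
                dHd = d.dot(Hd)
                if dHd <= 0:                  # nonpositive curvature: stop CG
                    break
                pairs.append((d.copy(), Hd.copy()))   # pair comes "for free"
                if len(pairs) > m:
                    pairs.pop(0)
                alpha = r.dot(r) / dHd
                p = p + alpha * d
                r_new = r - alpha * Hd
                beta = r_new.dot(r_new) / r.dot(r)
                r, d = r_new, r_new + beta * d
                if np.linalg.norm(r) < 1e-12:
                    break
            if np.linalg.norm(p) == 0:
                p = -g                        # fall back to steepest descent
        else:                                 # L-BFGS step from the memory
            p = -lbfgs_two_loop(g, pairs)
        t, f0, slope = 1.0, f(x), g.dot(p)
        while f(x + t * p) > f0 + 1e-4 * t * slope and t > 1e-12:
            t *= 0.5                          # Armijo backtracking
        x = x + t * p
    return x

# Demo on a convex quadratic whose condition number we control (an assumed
# stand-in for the paper's parametrized test family).
n = 20
A = np.diag(np.logspace(0, 2, n))             # condition number 1e2
b = np.ones(n)
f = lambda x: 0.5 * x.dot(A @ x) - b.dot(x)
grad = lambda x: A @ x - b
x = tn_with_memory(f, grad, np.zeros(n))
print(np.linalg.norm(grad(x)))                # small: near the minimizer
```

Because each CG iteration already pays for one gradient difference, the (d, Hd) pairs cost nothing beyond what the discrete Newton step requires; the design question the paper studies is how best to exploit that stored curvature in the quasi-Newton approximation.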
Citation
Byrd, R. H., Nocedal, J., & Zhu, C. (1996). Towards a Discrete Newton Method with Memory for Large-Scale Optimization. In Nonlinear Optimization and Applications (pp. 1–12). Springer US. https://doi.org/10.1007/978-1-4899-0289-4_1