Towards a Discrete Newton Method with Memory for Large-Scale Optimization

  • Byrd R
  • Nocedal J
  • Zhu C

Abstract

A new method for solving large nonlinear optimization problems is outlined. It attempts to combine the best properties of the discrete-truncated Newton method and the limited memory BFGS method, to produce an algorithm that is both economical and capable of handling ill-conditioned problems. The key idea is to use the curvature information generated during the computation of the discrete Newton step to improve the limited memory BFGS approximations. The numerical performance of the new method is studied using a family of functions whose nonlinearity and condition number can be controlled.
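The abstract's key idea can be illustrated with a small sketch: a finite-difference Hessian-vector product of the kind used in a discrete Newton step yields a pair (v, Hv), and such pairs can serve as the (s, y) correction pairs of the limited memory BFGS two-loop recursion. The code below is an illustrative reconstruction under that reading, not the paper's actual algorithm; the function names and the choice of probe directions are assumptions for the example.

```python
import numpy as np

def fd_hessian_vec(grad, x, v, eps=1e-6):
    """Discrete (finite-difference) Hessian-vector product:
    H v ~= (grad(x + eps*v) - grad(x)) / eps."""
    return (grad(x + eps * v) - grad(x)) / eps

def lbfgs_two_loop(g, pairs):
    """Apply the L-BFGS inverse-Hessian approximation, built from the
    stored (s, y) curvature pairs, to the vector g (two-loop recursion)."""
    q = g.copy()
    stack = []
    # First loop: newest pair first.
    for s, y in reversed(pairs):
        rho = 1.0 / np.dot(y, s)
        a = rho * np.dot(s, q)
        q -= a * y
        stack.append((a, rho, s, y))
    # Initial scaling H0 = gamma * I from the most recent pair.
    s_last, y_last = pairs[-1]
    gamma = np.dot(s_last, y_last) / np.dot(y_last, y_last)
    r = gamma * q
    # Second loop: oldest pair first.
    for a, rho, s, y in reversed(stack):
        b = rho * np.dot(y, r)
        r += (a - b) * s
    return r

# Example on a quadratic f(x) = 0.5 x^T A x with diagonal A, where the
# finite-difference products are exact: the pairs (e_i, A e_i) harvested
# from the discrete Newton computation let the two-loop recursion
# reproduce A^{-1} g.
d = np.array([1.0, 2.0, 3.0])
grad = lambda x: d * x           # gradient of the quadratic (A diagonal)
x0 = np.zeros(3)
pairs = [(e, fd_hessian_vec(grad, x0, e))
         for e in np.eye(3)]     # probe along coordinate directions
g = np.array([3.0, 4.0, 9.0])
r = lbfgs_two_loop(g, pairs)     # close to A^{-1} g = g / d
```

On a general (non-quadratic) function the same machinery applies, with the finite-difference products now approximate; the paper studies how such extra curvature information affects conditioning in practice.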

Citation (APA)
Byrd, R. H., Nocedal, J., & Zhu, C. (1996). Towards a Discrete Newton Method with Memory for Large-Scale Optimization. In Nonlinear Optimization and Applications (pp. 1–12). Springer US. https://doi.org/10.1007/978-1-4899-0289-4_1
