Conjugate Direction Methods for Quadratic Problems

Abstract

The problem of solving a set of linear equations with a symmetric positive definite matrix is equivalent to the problem of minimizing a quadratic function. Consider the problem of finding $x \in \mathbb{R}^n$ satisfying $Ax = b$, where $A \in \mathbb{R}^{n \times n}$, $b \in \mathbb{R}^n$ and $A$ is symmetric positive definite. The solution to this problem is also a solution of the optimization problem (P):

$$\min_{x \in \mathbb{R}^n} f(x) = \frac{1}{2} x^T A x - b^T x. \tag{1.1}$$

Consider the point $\bar{x}$ such that

$$\nabla f(\bar{x}) = g(\bar{x}) = A\bar{x} - b = 0. \tag{1.2}$$

We can show that (1.2) gives the necessary optimality conditions for problem (1.1).

Lemma 1.1. Suppose that $A$ is a symmetric positive definite matrix. If $\bar{x}$ solves the problem (1.1), then (1.2) holds.

Proof. Assume that $g(\bar{x}) = \bar{r} \neq 0$ and evaluate $f$ at the point $\bar{x} - \alpha \bar{r}$, where $\alpha$ is some positive number:

$$f(\bar{x} - \alpha \bar{r}) = \frac{1}{2} (\bar{x} - \alpha \bar{r})^T A (\bar{x} - \alpha \bar{r}) - b^T (\bar{x} - \alpha \bar{r}) = \frac{1}{2} \bar{x}^T A \bar{x} - \alpha \bar{x}^T A \bar{r} + \frac{1}{2} \alpha^2 \bar{r}^T A \bar{r} - b^T \bar{x} + \alpha b^T \bar{r}$$
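As a minimal numerical illustration of the equivalence described in the abstract (not taken from the chapter itself), the sketch below builds a small symmetric positive definite matrix, minimizes the quadratic $f$ of (1.1) with a standard conjugate gradient iteration, and checks that the minimizer satisfies $Ax = b$ and $g(x) = Ax - b \approx 0$ as in (1.2); the function and variable names are chosen for this example only.

```python
import numpy as np

# Build a small symmetric positive definite matrix A and right-hand side b.
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M @ M.T + 5 * np.eye(5)   # M M^T + 5I is symmetric positive definite
b = rng.standard_normal(5)

def f(x):
    """Quadratic objective f(x) = 1/2 x^T A x - b^T x from (1.1)."""
    return 0.5 * x @ A @ x - b @ x

def conjugate_gradient(A, b, tol=1e-10):
    """Minimize f (equivalently, solve Ax = b) by conjugate directions."""
    x = np.zeros_like(b)
    r = b - A @ x              # residual r = -g(x) = b - Ax
    p = r.copy()               # first search direction
    for _ in range(len(b)):
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)        # exact line search along p
        x = x + alpha * p
        r_new = r - alpha * Ap
        if np.linalg.norm(r_new) < tol:
            break
        beta = (r_new @ r_new) / (r @ r)  # update keeping directions A-conjugate
        p = r_new + beta * p
        r = r_new
    return x

x_cg = conjugate_gradient(A, b)
x_direct = np.linalg.solve(A, b)

# The minimizer of f agrees with the solution of Ax = b (up to round-off),
# and the gradient g(x) = Ax - b vanishes at the minimizer, as in (1.2).
print(np.allclose(x_cg, x_direct))          # True
print(np.linalg.norm(A @ x_cg - b))         # ~0
```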

Cite

APA

Conjugate Direction Methods for Quadratic Problems. (2008). In Conjugate Gradient Algorithms in Nonconvex Optimization (pp. 1–62). Springer Berlin Heidelberg. https://doi.org/10.1007/978-3-540-85634-4_1
