We give a theoretical explanation for the superlinear convergence behavior observed while solving large symmetric systems of equations using the Conjugate Gradient (CG) method or other Krylov subspace methods. We present a new bound on the relative error after n iterations. This bound is valid in an asymptotic sense, where the size N of the system grows together with the number n of iterations; it depends on the asymptotic eigenvalue distribution and on the ratio n/N. Similar bounds are given for the task of approximating eigenvalues of large symmetric matrices via Ritz values. Our findings are related to some recent results on the asymptotics of discrete orthogonal polynomials due to Rakhmanov and to Dragnev & Saff, followed by many other authors. An important tool in these investigations is a constrained energy problem in logarithmic potential theory. The present notes are intended to be self-contained (even if the proofs are sometimes incomplete and we refer to the original literature for details): the first part, about Krylov subspace methods, should be accessible to readers from the orthogonal polynomial community and also to those with little background in numerical linear algebra. In the second part we gather the necessary tools from logarithmic potential theory and recall the basic results on the nth-root asymptotics of discrete orthogonal polynomials. Finally, in the third part we discuss the fruitful relationship between these two fields and give several illustrating examples.
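As a concrete illustration of the method the notes analyze, the following is a minimal sketch of the Conjugate Gradient iteration for a symmetric positive definite system Ax = b; the matrix (a small discrete 1D Laplacian), right-hand side, and tolerance are illustrative choices and do not come from the notes themselves:

```python
# Minimal Conjugate Gradient (CG) sketch for a symmetric positive
# definite system A x = b. The residual norms it records are the
# quantities whose (superlinear) decay the notes study as n and N grow.

def cg(A, b, tol=1e-12, max_iter=100):
    n = len(b)
    x = [0.0] * n
    r = b[:]                      # residual r = b - A x (x = 0 initially)
    p = r[:]                      # first search direction
    rs_old = sum(ri * ri for ri in r)
    residuals = [rs_old ** 0.5]   # history of residual norms ||r||
    for _ in range(max_iter):
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rs_old / sum(p[i] * Ap[i] for i in range(n))
        x = [x[i] + alpha * p[i] for i in range(n)]
        r = [r[i] - alpha * Ap[i] for i in range(n)]
        rs_new = sum(ri * ri for ri in r)
        residuals.append(rs_new ** 0.5)
        if rs_new ** 0.5 < tol:
            break
        p = [r[i] + (rs_new / rs_old) * p[i] for i in range(n)]
        rs_old = rs_new
    return x, residuals

# Illustrative SPD example: tridiagonal discrete 1D Laplacian of size N.
N = 8
A = [[2.0 if i == j else (-1.0 if abs(i - j) == 1 else 0.0)
      for j in range(N)] for i in range(N)]
b = [1.0] * N
x, res = cg(A, b)
# In exact arithmetic CG terminates in at most N steps; for large N the
# residual history typically contracts faster than any fixed linear rate,
# which is the superlinear behavior bounded asymptotically in n/N.
```

The iteration keeps only a few vectors in memory, which is why CG and related Krylov subspace methods scale to the large systems considered here.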
Beckermann, B. (2006). Discrete orthogonal polynomials and superlinear convergence of Krylov subspace methods in numerical linear algebra. Lecture Notes in Mathematics, 1883, 119–185. https://doi.org/10.1007/978-3-540-36716-1_3