Abstract
A well-known result on the spectral variation of Hermitian matrices, due to Mirsky, is the following: let $A$ and $\widetilde A$ be two $n\times n$ Hermitian matrices, and let $\lambda_1,\ldots,\lambda_n$ and $\widetilde\lambda_1,\ldots,\widetilde\lambda_n$ be their eigenvalues arranged in ascending order. Then
$$\left\Vvert \operatorname{diag}\bigl(\lambda_1-\widetilde\lambda_1,\ldots,\lambda_n-\widetilde\lambda_n\bigr) \right\Vvert \le \left\Vvert A-\widetilde A \right\Vvert$$
for any unitarily invariant norm $\Vvert\cdot\Vvert$. In this paper, we generalize this result to the perturbation theory for diagonalizable matrix pencils with real spectra; the much-studied case of definite pencils is included as a special case.
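Mirsky's inequality can be checked numerically. The sketch below (an illustration, not part of the paper) draws a random Hermitian matrix, perturbs it, and verifies the bound for the Frobenius norm, which is unitarily invariant; the helper `hermitian` and the seed are assumptions for reproducibility.

```python
import numpy as np

rng = np.random.default_rng(0)

def hermitian(n, rng):
    """Random n-by-n Hermitian matrix (illustrative helper)."""
    M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    return (M + M.conj().T) / 2

n = 5
A = hermitian(n, rng)
A_tilde = A + 0.1 * hermitian(n, rng)   # perturbed matrix

# eigvalsh returns real eigenvalues in ascending order
lam = np.linalg.eigvalsh(A)
lam_tilde = np.linalg.eigvalsh(A_tilde)

# Frobenius norm of diag(lam - lam_tilde) is the 2-norm of the vector
lhs = np.linalg.norm(lam - lam_tilde)
rhs = np.linalg.norm(A - A_tilde, 'fro')
print(lhs <= rhs)  # Mirsky's bound holds
```

The same check works for any unitarily invariant norm, e.g. the spectral norm via `np.linalg.norm(..., 2)` applied to the difference matrix and `np.max(np.abs(lam - lam_tilde))` on the eigenvalue side.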
Bhatia, R., & Li, R.-C. (1996). On perturbations of matrix pencils with real spectra. II. Mathematics of Computation, 65(214), 637–645. https://doi.org/10.1090/s0025-5718-96-00699-0