The Singular Value Decomposition in Multivariate Statistics


Abstract

Many multivariate techniques in statistics are described in terms of an appropriate sums of squares and cross-products matrix, such as a covariance matrix or a correlation matrix, rather than in terms of the original data matrix. While this is frequently the best way of understanding and analysing a technique, it is not necessarily the most satisfactory approach for implementing the technique computationally. From a numerical point of view, it is usually better to work with the data matrix. This is a review article aimed at statisticians and mathematicians who, while not expert numerical analysts, would like to gain some understanding of why it is better to work with the data matrix, and of the techniques that allow us to avoid the explicit computation of sums of squares and cross-products matrices. To give a focus and to keep the article of moderate length, we concentrate in particular on the use of the singular value decomposition and its application to multiple regression problems. In the final two sections we give a brief discussion of principal components, canonical correlations and the generalized singular value decomposition.
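The abstract's central numerical point can be sketched in a few lines of NumPy. This is an illustrative example, not code from the article: it solves a least-squares regression via the SVD of the data matrix X and via the normal equations formed from the cross-products matrix X^T X, and shows why the latter is riskier — the condition number of X^T X is the square of that of X, so forming it can roughly double the digits of accuracy lost.

```python
import numpy as np

# Hypothetical regression problem: X is the data matrix, y the response.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 3))
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true + 0.01 * rng.standard_normal(100)

# SVD approach: X = U diag(s) V^T, so beta = V diag(1/s) U^T y.
# Works directly on the data matrix; no cross-products matrix is formed.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
beta_svd = Vt.T @ ((U.T @ y) / s)

# Normal-equations approach: solve (X^T X) beta = X^T y.
# Explicitly forms the cross-products matrix X^T X.
beta_ne = np.linalg.solve(X.T @ X, X.T @ y)

# The two agree here because X is well conditioned, but
# cond(X^T X) = cond(X)^2, so the normal equations degrade
# much faster as X approaches rank deficiency.
print(np.linalg.cond(X) ** 2)   # approximately equals ...
print(np.linalg.cond(X.T @ X))  # ... this value
```

For an ill-conditioned X (e.g. nearly collinear columns), `beta_svd` remains usable while `beta_ne` can lose all accuracy, which is the motivation the abstract gives for working with the data matrix.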

Author-supplied keywords

  • 15 Linear and multilinear algebra; matrix theory
  • 62 Statistics
  • 65 Numerical analysis


Authors

  • Sven Hammarling
