The common vector approach and its relation to principal component analysis

  • M. Bilginer Gülmezoǧlu
  • Vakif Dzhafarov
  • Atalay Barkana

Abstract

The main point of the paper is to show the close relation between the nonzero principal components and the difference subspace, together with the complementary close relation between the zero principal components and the common vector. A common vector representing each word-class is obtained from the eigenvectors of the covariance matrix of its own word-class; that is, the common vector is in the direction of a linear combination of the eigenvectors corresponding to the zero eigenvalues of the covariance matrix. The methods that use the nonzero principal components for recognition purposes suggest the elimination of all the features that are in the direction of the eigenvectors corresponding to the smallest eigenvalues (including the zero eigenvalues) of the covariance matrix, whereas the common vector approach suggests the elimination of all the features that are in the direction of the eigenvectors corresponding to the largest (all nonzero) eigenvalues of the covariance matrix.

Author-supplied keywords

  • Common vector approach
  • Speech recognition
  • Subspace methods
