The support vector decomposition machine

Abstract

In machine learning problems with tens of thousands of features and only dozens or hundreds of independent training examples, dimensionality reduction is essential for good learning performance. In previous work, many researchers have treated the learning problem in two separate phases: first use an algorithm such as singular value decomposition to reduce the dimensionality of the data set, and then use a classification algorithm such as naïve Bayes or support vector machines to learn a classifier. We demonstrate that it is possible to combine the two goals of dimensionality reduction and classification into a single learning objective, and present a novel and efficient algorithm which optimizes this objective directly. We present experimental results in fMRI analysis which show that we can achieve better learning performance and lower-dimensional representations than two-phase approaches can.
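
The abstract describes folding SVD-style dimensionality reduction and SVM-style classification into one learning objective, but does not state that objective explicitly. As a rough sketch in our own notation (not necessarily the paper's exact formulation), a combined objective of this kind could look like

    \min_{U, V, w, b} \; \| X - U V \|_F^2 \;+\; C \sum_i \max\bigl( 0,\; 1 - y_i ( w^\top u_i + b ) \bigr)

where X is the examples-by-features data matrix, U V is its low-rank factorization, u_i is the reduced representation of example i with label y_i, (w, b) is a linear classifier acting on the reduced representations, and C trades off reconstruction error against hinge loss. Objectives of this form are commonly optimized by alternating between updates to the factorization and updates to the classifier parameters.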

Authors

  • Francisco Pereira

  • Geoffrey Gordon
