Structured large margin machines: Sensitive to data distributions

  • Yeung D
  • Wang D
  • Ng W
  • et al.

18 Mendeley users have this article in their library; 46 citations of this article.


This paper proposes a new large margin classifier—the structured large margin machine (SLMM)—that is sensitive to the structure of the data distribution. The SLMM approach combines the merits of “structured” learning models, such as radial basis function networks and Gaussian mixture models, with the advantages of “unstructured” large margin learning schemes, such as support vector machines (SVMs) and maxi-min margin machines (M4s). We derive the SLMM model from the concepts of “structured degree” and “homospace”, based on an analysis of existing structured and unstructured learning models. Then, by using Ward’s agglomerative hierarchical clustering on the input data (or on data mappings in the kernel space) to extract the underlying data structure, we formulate SLMM training as a sequential second-order cone programming problem. Many promising features of the SLMM approach are illustrated, including its accuracy, scalability, extensibility, and noise tolerance. We also demonstrate the theoretical importance of the SLMM model by showing that it generalizes existing approaches such as SVMs and M4s, provides novel insight into learning models, and lays a foundation for conceiving other “structured” classifiers.
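The structure-extraction step described above—Ward's agglomerative hierarchical clustering followed by per-cluster statistics—can be sketched as follows. This is a minimal illustration using SciPy, not the authors' implementation; the function name `extract_structure` and the choice of two clusters are assumptions for the example, and the per-cluster mean/covariance pairs stand in for the kind of structural information the SLMM's second-order cone constraints would consume.

```python
# Sketch (not the authors' code): extract per-class structure with Ward's
# agglomerative clustering, then summarize each cluster by its mean and
# covariance, as inputs for a structure-aware large margin formulation.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

def extract_structure(X, n_clusters=2):
    """Cluster X with Ward's method; return (mean, covariance) per cluster."""
    Z = linkage(X, method="ward")               # agglomerative merge tree
    labels = fcluster(Z, t=n_clusters, criterion="maxclust")
    stats = []
    for k in np.unique(labels):
        Xk = X[labels == k]
        stats.append((Xk.mean(axis=0), np.cov(Xk, rowvar=False)))
    return stats

rng = np.random.default_rng(0)
# Toy class made of two well-separated sub-clusters -- the kind of
# within-class structure the SLMM is designed to be sensitive to.
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(5, 1, (50, 2))])
clusters = extract_structure(X, n_clusters=2)
print(len(clusters))  # one (mean, covariance) pair per sub-cluster
```

In the full SLMM training procedure, such cluster covariances would enter the margin constraints (e.g., via a weighted Mahalanobis distance), and the resulting optimization would be solved as a sequential second-order cone program; that solver step is omitted here.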

Author-supplied keywords

  • Agglomerative hierarchical clustering
  • Homospace
  • Large margin learning
  • Second order cone programming (SOCP)
  • Structured learning
  • Weighted Mahalanobis distance (WMD)

