Small-variance asymptotics for Dirichlet process mixtures of SVMs

Cited by 5 · 8 Mendeley readers

Abstract

Infinite SVM (iSVM) is a Dirichlet process (DP) mixture of large-margin classifiers. Although iSVM is flexible in learning nonlinear classifiers and discovering latent clustering structures, its inference is difficult, and existing methods can hinder its applicability to large-scale problems. This paper presents a small-variance asymptotic analysis that yields a simple and efficient algorithm, which monotonically optimizes a max-margin DP-means (M²DPM) problem, an extension of DP-means to both predictive learning and descriptive clustering. Our analysis is built on Gibbs infinite SVMs, an alternative DP mixture of large-margin machines that admits a partially collapsed Gibbs sampler without truncation by exploiting data augmentation techniques. Experimental results show that M²DPM runs much faster than comparable algorithms without sacrificing prediction accuracy.
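For context, M²DPM is described as an extension of DP-means, the hard-clustering algorithm obtained by taking the small-variance limit of a DP Gaussian mixture: each point joins its nearest centroid unless the squared distance exceeds a penalty, in which case it spawns a new cluster, and the objective (distortion plus a per-cluster penalty) decreases monotonically. Below is a minimal sketch of plain DP-means only, without the max-margin loss terms that M²DPM adds; the function name `dp_means` and the penalty parameter `lam` are illustrative, not taken from the paper.

```python
import numpy as np

def dp_means(X, lam, max_iters=100):
    """Plain DP-means (no max-margin terms): hard clustering from the
    small-variance asymptotics of a DP Gaussian mixture. A point whose
    squared distance to every existing centroid exceeds `lam` spawns a
    new cluster."""
    centroids = [X.mean(axis=0)]          # start with one global cluster
    prev_obj = np.inf
    assignments = np.zeros(len(X), dtype=int)
    for _ in range(max_iters):
        # Assignment step: nearest centroid, or a new cluster if cost > lam.
        labels = []
        for x in X:
            d2 = np.array([np.sum((x - c) ** 2) for c in centroids])
            j = int(d2.argmin())
            if d2[j] > lam:
                centroids.append(x.copy())
                j = len(centroids) - 1
            labels.append(j)
        labels = np.array(labels)
        # Update step: drop empty clusters, recompute means, relabel densely.
        kept = np.unique(labels)
        centroids = [X[labels == k].mean(axis=0) for k in kept]
        remap = {k: i for i, k in enumerate(kept)}
        assignments = np.array([remap[l] for l in labels])
        # Objective = distortion + lam * (number of clusters); it decreases
        # monotonically, so stop once it no longer improves.
        obj = sum(np.sum((X[i] - centroids[assignments[i]]) ** 2)
                  for i in range(len(X))) + lam * len(centroids)
        if obj >= prev_obj - 1e-10:
            break
        prev_obj = obj
    return centroids, assignments
```

With `lam` set between the within-cluster and between-cluster squared distances, two well-separated blobs are recovered as two clusters; M²DPM augments this coordinate descent with max-margin prediction terms, which this sketch omits.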

Citation (APA)

Wang, Y., & Zhu, J. (2014). Small-variance asymptotics for Dirichlet process mixtures of SVMs. In Proceedings of the National Conference on Artificial Intelligence (Vol. 3, pp. 2135–2141). AI Access Foundation. https://doi.org/10.1609/aaai.v28i1.8959
