In this paper, we investigate one-class and clustering problems within the framework of statistical learning theory. To establish a universal framework, an unsupervised learning problem with a predefined threshold η is formally described, and an intuitive notion of margin is introduced. One-class and clustering problems are then formulated as two specific η-unsupervised problems. By defining a specific hypothesis space for η-one-class problems, the well-known minimal enclosing sphere algorithm for regular one-class problems is shown to be a maximum-margin algorithm. Furthermore, new one-class and clustering margin-based algorithms can be derived from different hypothesis spaces. Because the proposed algorithms inherit the maximum-margin nature of SVMs, they are robust, flexible, and high-performing; because their parameters carry the same interpretation as in SVMs, the resulting unsupervised learning framework is clear and natural. Experiments on synthetic and real data verify the soundness of our formulation and demonstrate that the proposed framework is not only of theoretical interest but also has a legitimate place among practical unsupervised learning techniques. © Springer-Verlag Berlin Heidelberg 2005.
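To give a concrete sense of the minimal enclosing sphere idea behind one-class learning, the following is a minimal sketch in plain Python. It uses a naive sphere (centroid as center, farthest training point as radius) purely for illustration; the paper's actual algorithm solves a kernelized soft-margin optimization problem, and the function names here are hypothetical.

```python
import math

def fit_sphere(points):
    """Naive enclosing sphere: center at the centroid, radius set by the
    farthest training point. (The paper's method instead optimizes a
    soft-margin sphere, possibly in a kernel-induced feature space.)"""
    n = len(points)
    dim = len(points[0])
    center = tuple(sum(p[i] for p in points) / n for i in range(dim))
    radius = max(math.dist(p, center) for p in points)
    return center, radius

def is_inlier(point, center, radius):
    """A point is accepted by the one-class model if it lies inside the sphere."""
    return math.dist(point, center) <= radius

# Toy usage: points near the training cloud are inliers, distant ones are not.
train = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
c, r = fit_sphere(train)
print(is_inlier((0.5, 0.5), c, r))  # near the centroid -> True
print(is_inlier((5.0, 5.0), c, r))  # far outside the sphere -> False
```

The soft-margin version studied in the paper additionally allows some training points to fall outside the sphere, trading radius against slack, which is exactly where the SVM-style margin interpretation enters.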
Tao, Q., Wu, G. W., Wang, F. Y., & Wang, J. (2005). Some marginal learning algorithms for unsupervised problems. In Lecture Notes in Computer Science (Vol. 3495, pp. 395–401). Springer Verlag. https://doi.org/10.1007/11427995_34