Online DC optimization for online binary linear classification

Abstract

This paper concerns online algorithms for online binary linear classification (OBLC) problems in machine learning. In online classification, instances arrive one at a time, and on each round the problem consists in finding a linear classifier that predicts the label of the new instance. In OBLC, the quality of predictions is assessed by a loss function, specifically the 0–1 loss. This loss function is nonconvex and nonsmooth, which makes such problems intractable. In the literature, the Perceptron is a well-known online classification algorithm in which a surrogate convex loss function is substituted for the 0–1 loss. In this paper, we investigate an efficient DC loss function that is a suitable approximation of the usual 0–1 loss. Based on Online DC (Difference of Convex functions) programming and Online DCA (DC Algorithms) [10], we develop an online classification algorithm. Numerical experiments on several test problems show the efficiency of the proposed algorithm compared with the Perceptron.
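The abstract contrasts the Perceptron's convex surrogate with a DC (difference-of-convex) surrogate for the 0–1 loss. A minimal Python sketch of the two ingredients: the classical mistake-driven Perceptron update, and the ramp loss, a standard DC decomposition of a bounded 0–1 loss approximation (shown for illustration only; the specific DC loss studied in the paper may differ).

```python
import numpy as np

def ramp_loss(t):
    """Ramp loss min(1, max(0, 1 - t)): a bounded, nonconvex approximation of
    the 0-1 loss on the margin t = y * (w . x), written in DC form as the
    difference of two convex hinges."""
    return max(0.0, 1.0 - t) - max(0.0, -t)

def perceptron_online(stream, dim, lr=1.0):
    """Classical online Perceptron over a stream of (x, y) pairs, y in {-1, +1}.
    On each round it predicts with the current linear classifier and updates
    the weights whenever a mistake (a unit of 0-1 loss) is incurred."""
    w = np.zeros(dim)
    mistakes = 0
    for x, y in stream:
        y_hat = 1.0 if w @ x >= 0 else -1.0  # predict sign(w . x)
        if y_hat != y:                        # 0-1 loss incurred this round
            mistakes += 1
            w = w + lr * y * x                # mistake-driven update
    return w, mistakes
```

In a DC-based online algorithm one would instead minimize a loss such as `ramp_loss(y * (w @ x))` on each round, e.g. by linearizing the concave part at the current iterate as Online DCA does; the sketch above only illustrates the baseline and the DC form of the loss.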

Citation (APA)

Thanh, H. V., An, L. T. H., & Chien, B. D. (2016). Online DC optimization for online binary linear classification. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 9622, pp. 661–670). Springer Verlag. https://doi.org/10.1007/978-3-662-49390-8_64
