Cost-sensitive alternating direction method of multipliers for large-scale classification

Abstract

Large-scale classification is one of the most significant topics in machine learning. However, previous classification methods typically assume that the data have a balanced class distribution, so they often suffer performance degradation when dealing with imbalanced data. To achieve better performance in large-scale classification, this paper proposes a novel Cost-Sensitive Alternating Direction Method of Multipliers (CSADMM) to deal with imbalanced data. CSADMM decomposes the problem into a series of subproblems that are efficiently solved in parallel by a dual coordinate descent method. In particular, CSADMM incorporates different classification costs through cost-sensitive learning for large-scale imbalanced classification. Experimental results on several large-scale imbalanced datasets show that, compared with distributed random forest and a fuzzy rule-based classification system, CSADMM achieves better classification performance while significantly reducing training time. Moreover, compared with single-machine methods, CSADMM also shows promising results.
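As an illustration of the general idea, and not the authors' exact formulation, the following is a minimal sketch assuming a standard consensus-ADMM splitting of a cost-sensitive hinge-loss classifier over M data blocks; the symbols c_+, c_-, \rho, u_m, and \mathcal{B}_m are our own notation. The per-block w-updates are the subproblems that a dual coordinate descent solver could handle in parallel.

\min_{\{w_m\},\, z}\ \frac{1}{2}\|z\|_2^2 + \sum_{m=1}^{M}\sum_{i \in \mathcal{B}_m} c_{y_i}\max\bigl(0,\, 1 - y_i w_m^{\top} x_i\bigr) \quad \text{s.t.}\ w_m = z,\ m = 1,\dots,M,

where c_{y_i} = c_+ if y_i = +1 and c_{y_i} = c_- otherwise, so minority-class errors can be penalized more heavily. With penalty parameter \rho and scaled dual variables u_m, the ADMM iterations read

w_m^{k+1} = \arg\min_{w_m}\ \sum_{i \in \mathcal{B}_m} c_{y_i}\max\bigl(0,\, 1 - y_i w_m^{\top} x_i\bigr) + \frac{\rho}{2}\bigl\|w_m - z^{k} + u_m^{k}\bigr\|_2^2,
z^{k+1} = \frac{\rho \sum_{m=1}^{M}\bigl(w_m^{k+1} + u_m^{k}\bigr)}{1 + M\rho},
u_m^{k+1} = u_m^{k} + w_m^{k+1} - z^{k+1}.

Each w_m-update is an independent regularized cost-sensitive SVM-type problem over one data block, which is why the blocks can be processed in parallel, while the z-update is a cheap closed-form averaging step.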

Citation (APA)

Wang, H., Shi, Y., Chen, X., & Gao, Y. (2017). Cost-sensitive alternating direction method of multipliers for large-scale classification. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 10585 LNCS, pp. 315–325). Springer Verlag. https://doi.org/10.1007/978-3-319-68935-7_35
