To address the imbalanced-data problem in classification, studies combining AdaBoost (short for "Adaptive Boosting") with cost-sensitive learning have shown convincing results in the literature. Cost-sensitive AdaBoost algorithms are practical because the "boosting" property of AdaBoost can iteratively reinforce the small, high-cost class emphasized by cost-sensitive learning, thereby mitigating the imbalanced-data issue. However, most of the available cost-sensitive AdaBoost algorithms are heuristic: they modify the standard AdaBoost algorithm by cost-sensitively adjusting the voting-weight parameters of the weak classifiers or the sample-weight update parameters, without rigorous theoretical proof. These algorithms append cost-sensitive factors to focus on the high-cost, small-class samples, but they provide no procedure for deciding where the cost factors should be added or what values they should take. To complete the framework of cost-sensitive AdaBoost algorithms, the present article makes two main contributions. First, we summarize the popular cost-sensitive boosting algorithms in the literature and propose a comprehensive general form; we name our specific variant the "AdaImC algorithm," which is applicable to the imbalanced-data classification problem with theoretical proof. Second, we propose a statistical approach to proving the AdaImC algorithm, verifying the inner relationship between its cost parameters. We show that the proposed algorithm from the machine-learning field is identical to the Product of Experts (PoE) model from statistics. In addition, a way to determine the cost parameter values by statistical analysis is introduced. Several numerical studies are presented to support the proposed algorithm.
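The heuristic scheme described above can be illustrated with a minimal sketch: a standard AdaBoost loop over decision stumps in which a per-sample cost factor is inserted into the sample-weight update, so that misclassified high-cost (minority-class) samples gain weight faster than in plain AdaBoost. This is a generic cost-sensitive variant for illustration, not the specific AdaImC algorithm of the paper; all function names and the placement of the cost factor in the exponent are illustrative assumptions.

```python
import numpy as np

def stump_train(X, y, w):
    """Pick the decision stump (feature, threshold, polarity) with
    minimal weighted error. Labels y are assumed to be in {-1, +1}."""
    best = (np.inf, 0, 0.0, 1)  # (weighted error, feature, threshold, polarity)
    n, d = X.shape
    for j in range(d):
        for t in np.unique(X[:, j]):
            for s in (1, -1):
                pred = np.where(s * (X[:, j] - t) >= 0, 1, -1)
                err = np.sum(w * (pred != y))
                if err < best[0]:
                    best = (err, j, t, s)
    return best

def stump_predict(X, feat, thresh, pol):
    return np.where(pol * (X[:, feat] - thresh) >= 0, 1, -1)

def cost_sensitive_adaboost(X, y, costs, rounds=10):
    """Heuristic cost-sensitive AdaBoost: the per-sample cost factor
    scales the exponent of the weight update, so high-cost samples
    are re-weighted more aggressively than in standard AdaBoost."""
    n = len(y)
    w = np.full(n, 1.0 / n)
    ensemble = []
    for _ in range(rounds):
        err, j, t, s = stump_train(X, y, w)
        err = max(err, 1e-10)
        if err >= 0.5:          # weak learner no better than chance
            break
        alpha = 0.5 * np.log((1 - err) / err)   # voting weight
        pred = stump_predict(X, j, t, s)
        # cost-sensitive sample-weight update (cost factor in the exponent)
        w *= np.exp(-alpha * costs * y * pred)
        w /= w.sum()
        ensemble.append((alpha, j, t, s))
    return ensemble

def predict(ensemble, X):
    score = sum(a * stump_predict(X, j, t, s) for a, j, t, s in ensemble)
    return np.where(score >= 0, 1, -1)
```

For example, on a 1-D toy set with three minority positives (`costs` set to 3 for positives, 1 for negatives), the ensemble emphasizes the rare class during training; where exactly the cost factor enters (sample update, voting weight, or both) is precisely the design question the paper formalizes.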
Bei, H., Wang, Y., Ren, Z., Jiang, S., Li, K., & Wang, W. (2021). A Statistical Approach to Cost-Sensitive AdaBoost for Imbalanced Data Classification. Mathematical Problems in Engineering, 2021. https://doi.org/10.1155/2021/3165589