Distribution-Balanced Loss for Multi-label Classification in Long-Tailed Datasets


Abstract

We present a new loss function called Distribution-Balanced Loss for multi-label recognition problems that exhibit long-tailed class distributions. Compared to the conventional single-label classification problem, multi-label recognition is often more challenging due to two significant issues, namely the co-occurrence of labels and the dominance of negative labels (when treated as multiple binary classification problems). The Distribution-Balanced Loss tackles these issues through two key modifications to the standard binary cross-entropy loss: 1) a new way to re-balance the weights that takes into account the impact caused by label co-occurrence, and 2) a negative-tolerant regularization to mitigate the over-suppression of negative labels. Experiments on both Pascal VOC and COCO show that models trained with this new loss function achieve significant performance gains over existing methods. Code and models are available at: https://github.com/wutong16/DistributionBalancedLoss.
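To make the two modifications concrete, the sketch below implements them with NumPy under stated assumptions: the re-balancing weight is the ratio of a class-level sampling probability (proportional to 1/n_i) to an instance-level one (summed over an instance's positive labels), smoothed by a sigmoid; the negative-tolerant term shifts logits by a class-prior bias and scales the negative log term by 1/λ. All hyper-parameter names and default values (alpha, beta, mu, lam, kappa) are illustrative, not the paper's exact settings; consult the linked repository for the authors' implementation.

```python
import numpy as np

def distribution_balanced_loss(logits, labels, class_freq,
                               alpha=0.1, beta=10.0, mu=0.2,
                               lam=2.0, kappa=0.05):
    """Sketch of a Distribution-Balanced Loss.

    logits: (N, C) raw scores; labels: (N, C) in {0, 1};
    class_freq: (C,) positive-sample counts per class.
    Hyper-parameters here are assumptions, not the paper's values.
    """
    class_freq = np.asarray(class_freq, dtype=float)

    # 1) Re-balanced weights accounting for label co-occurrence:
    # class-level sampling probability P^C_i ∝ 1 / n_i, while the
    # instance-level probability P^I(x) ∝ sum of 1/n_i over the
    # instance's positive labels, so co-occurring heads dilute tails.
    pc = 1.0 / class_freq                                      # (C,)
    pi = (labels / class_freq).sum(axis=1, keepdims=True)      # (N, 1)
    r = pc / np.maximum(pi, 1e-12)                             # (N, C)
    # smooth the raw ratio into a bounded weight
    r_hat = alpha + 1.0 / (1.0 + np.exp(-beta * (r - mu)))

    # 2) Negative-tolerant regularization: shift each logit by a
    # class-prior bias v_i and down-scale the negative term by 1/lam
    # so rare classes' negatives are not over-suppressed.
    prior = class_freq / class_freq.sum()
    v = kappa * np.log(prior / (1.0 - prior))                  # (C,)
    z = logits - v
    pos = labels * np.log1p(np.exp(-z))
    neg = (1.0 - labels) / lam * np.log1p(np.exp(lam * z))
    return float((r_hat * (pos + neg)).mean())
```

Setting `lam=1`, `kappa=0`, and a constant `r_hat` recovers plain per-class binary cross-entropy, which makes the two modifications easy to ablate.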

Citation (APA)

Wu, T., Huang, Q., Liu, Z., Wang, Y., & Lin, D. (2020). Distribution-Balanced Loss for Multi-label Classification in Long-Tailed Datasets. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 12349 LNCS, pp. 162–178). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-030-58548-8_10
