Hierarchical Class-Based Curriculum Loss

Abstract

Classification algorithms in machine learning often assume a flat label space. However, most real-world data have dependencies between the labels, which can often be captured by a hierarchy. Exploiting this relation can yield a model that satisfies the label dependencies while improving accuracy and interpretability. Further, since different levels in the hierarchy correspond to different granularities, penalizing every label equally can be detrimental to model learning. In this paper, we propose a loss function, hierarchical curriculum loss, with two properties: (i) it satisfies the hierarchical constraints present in the label space, and (ii) it assigns non-uniform weights to labels based on their levels in the hierarchy, learned implicitly by the training paradigm. We theoretically show that the proposed hierarchical class-based curriculum loss is a tight bound on the 0-1 loss among all losses satisfying the hierarchical constraints. We test our loss function on real-world image data sets and show that it significantly outperforms state-of-the-art baselines.
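The abstract's two properties (hierarchy-consistent predictions and level-dependent label weights) can be illustrated with a minimal sketch. This is not the paper's exact formulation; the constraint-propagation step, the binary cross-entropy surrogate, and the `level_weight` mapping below are all illustrative assumptions, standing in for the paper's curriculum-learned weighting.

```python
import numpy as np

def constrained_probs(probs, parent):
    """Post-process raw label probabilities so that no child exceeds its
    parent -- one common way to satisfy hierarchy constraints.
    parent[i] is the index of node i's parent, or -1 for a root.
    Nodes are assumed topologically ordered (parents before children)."""
    out = probs.copy()
    for i, p in enumerate(parent):
        if p >= 0:
            out[i] = min(out[i], out[p])  # enforce p(child) <= p(parent)
    return out

def hierarchical_loss(probs, targets, parent, level, level_weight):
    """Level-weighted binary cross-entropy over the label hierarchy.
    level[i] is node i's depth; level_weight maps depth -> weight.
    (A hypothetical surrogate, not the paper's 0-1 bound.)"""
    q = constrained_probs(probs, parent)
    eps = 1e-12  # avoid log(0)
    bce = -(targets * np.log(q + eps) + (1 - targets) * np.log(1 - q + eps))
    w = np.array([level_weight[l] for l in level])
    return float(np.sum(w * bce))
```

For example, on a small tree `root(0) -> {1, 2}`, `1 -> {3}`, a raw prediction of 0.7 for node 3 is clipped to its parent's 0.5, so the constrained scores respect the hierarchy before the weighted loss is computed.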

Citation (APA)
Goyal, P., Choudhary, D., & Ghosh, S. (2021). Hierarchical Class-Based Curriculum Loss. In IJCAI International Joint Conference on Artificial Intelligence (pp. 2448–2454). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2021/337
