In a multi-label dataset, an instance is given a single representation across all possible labels. Although the labels share the same set of instances, the membership of each instance varies from label to label, which diversifies the intrinsic class geometries of the labels. Multi-label datasets are also often class-imbalanced, and the varying instance membership coupled with this imbalance gives rise to imbalance ratios that differ across labels. We address these two key aspects in this work, Lattice and Imbalance Informed Multi-label Learning (LIIML), through a two-step procedure. First, we obtain the imbalance ratio and the intrinsic positive and negative class lattices of each label, and capitalize on these two pieces of information to construct a dedicated feature set for each label. In the second step, to handle the class imbalance further, we employ a scheme of imbalance-adaptive misclassification costs across the labels. We evaluate the competence of the proposed method in both a generic and a class-imbalanced framework, and an elaborate empirical study establishes its competence in both contexts.
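As a rough illustration of the imbalance-adaptive cost idea only, the sketch below computes a per-label imbalance ratio and uses it to weight the misclassification cost of the minority (positive) class in a binary-relevance setup. This is not the authors' LIIML implementation: the lattice-based per-label feature construction is omitted, the toy data, variable names, and the logistic-regression base learner are all assumptions made for the example.

```python
# Hypothetical sketch of imbalance-adaptive misclassification costs per label.
# NOT the LIIML method itself: lattice-informed features are omitted and a
# plain logistic-regression base learner is assumed for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))                               # toy feature matrix
Y = (rng.random((200, 3)) < [0.5, 0.2, 0.05]).astype(int)    # 3 labels with varying imbalance

classifiers = []
for j in range(Y.shape[1]):
    y = Y[:, j]
    n_pos = int(y.sum())
    n_neg = len(y) - n_pos
    imbalance_ratio = n_neg / max(n_pos, 1)   # per-label imbalance ratio (negatives per positive)

    # Imbalance-adaptive cost: misclassifying a positive (minority) instance of
    # this label is penalised roughly in proportion to the label's imbalance ratio.
    clf = LogisticRegression(class_weight={0: 1.0, 1: imbalance_ratio}, max_iter=1000)
    clf.fit(X, y)
    classifiers.append(clf)

# Prediction: each per-label classifier decides its own label independently.
Y_pred = np.column_stack([clf.predict(X) for clf in classifiers])
```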
Sadhukhan, P., & Palit, S. (2020). Lattice and imbalance informed multi-label learning. IEEE Access, 8, 7394–7407. https://doi.org/10.1109/ACCESS.2019.2962201