In recent years, multi-label feature selection has attracted significant attention from the machine learning, statistical computing, and related communities, and has been widely applied to diverse problems ranging from music recognition to text mining and image annotation. However, traditional multi-label feature selection methods employ a cumulative summation strategy to design selection criteria, which suffers from overestimating the redundancy of some candidate features. Moreover, the cumulative summation strategy yields a high objective value when a candidate feature is completely related to one or a few already-selected features, even though it is almost independent of the majority of already-selected features. To address these issues, we propose a new multi-label feature selection method named Feature Redundancy Maximization (FRM), which combines the cumulative summation of conditional mutual information with the 'maximum of the minimum' criterion. Additionally, FRM can be rewritten as an equivalent multi-label feature selection method that employs interaction information as a measure of feature redundancy, obtaining an increasingly accurate score of feature redundancy as the number of already-selected features grows. Finally, extensive experiments are conducted on fourteen benchmark multi-label data sets in comparison with six state-of-the-art methods. The experimental results demonstrate the superiority of the proposed method.
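To make the 'maximum of the minimum' idea concrete, the sketch below implements a greedy selection loop in the style of the classic CMIM criterion for a single discrete label: each step picks the candidate whose worst-case (minimum) conditional mutual information with the label, given any already-selected feature, is largest. This is an illustrative simplification, not the paper's FRM objective — FRM combines this min-max idea with cumulative summation over multiple labels, and the function names and the plug-in entropy estimator here are assumptions for the example.

```python
import numpy as np

def entropy(*cols):
    # Joint Shannon entropy (in nats) of one or more discrete columns,
    # using a simple plug-in estimate of the joint distribution.
    joint = np.stack(cols, axis=1)
    _, counts = np.unique(joint, axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))

def mi(x, y):
    # I(X; Y) = H(X) + H(Y) - H(X, Y)
    return entropy(x) + entropy(y) - entropy(x, y)

def cond_mi(x, y, z):
    # I(X; Y | Z) = H(X, Z) + H(Y, Z) - H(X, Y, Z) - H(Z)
    return entropy(x, z) + entropy(y, z) - entropy(x, y, z) - entropy(z)

def minmax_select(X, y, k):
    """Greedy 'maximum of the minimum' selection (CMIM-style sketch).

    X: (n_samples, n_features) array of discrete features;
    y: (n_samples,) discrete label; k: number of features to pick.
    """
    n_feat = X.shape[1]
    # Seed with the single most relevant feature.
    selected = [int(np.argmax([mi(X[:, j], y) for j in range(n_feat)]))]
    while len(selected) < k:
        best_j, best_score = None, -np.inf
        for j in range(n_feat):
            if j in selected:
                continue
            # Worst-case conditional relevance over already-selected features:
            # a feature fully explained by any selected one scores ~0 here,
            # avoiding the overestimation that pure summation can cause.
            score = min(cond_mi(X[:, j], y, X[:, s]) for s in selected)
            if score > best_score:
                best_j, best_score = j, score
        selected.append(best_j)
    return selected
```

For example, if the label is the XOR of two features and a third feature duplicates the first, the min-max rule picks the two XOR inputs and skips the duplicate, since the duplicate's conditional relevance collapses to zero once its copy is selected.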
Gao, W., Hu, J., Li, Y., & Zhang, P. (2020). Feature Redundancy Based on Interaction Information for Multi-Label Feature Selection. IEEE Access, 8, 146050–146064. https://doi.org/10.1109/ACCESS.2020.3015755