In this paper we give a systematic analysis of the relationship between class imbalance and class overlap as factors influencing classifier performance. We demonstrate that these two factors have interdependent effects and that neither can be fully understood in isolation. Although the imbalance problem can be viewed as a symptom of the small disjuncts problem, which is alleviated by larger training sets, the overlap problem is of a fundamentally different character: when overlap is present, the performance of learned classifiers can actually worsen as more training data is used. We also examine the effects of overlap and imbalance on the complexity of the learned model and demonstrate that, in this respect, overlap is a far more serious factor than imbalance. © 2010 Springer-Verlag Berlin Heidelberg.
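The interaction the abstract describes can be illustrated with a minimal synthetic sketch (our own construction, not the paper's experimental setup): two 1-D Gaussian classes with a fixed 10:1 imbalance, classified by an equal-variance Gaussian rule whose decision threshold incorporates the class priors. Holding the imbalance constant and varying only the distance between the class means shows that the same imbalance is far more damaging to minority-class recall when the classes overlap. All function and parameter names here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def minority_recall(n_major, n_minor, sep):
    """Fit an equal-variance Gaussian (Bayes-style) classifier to two
    1-D classes and return recall on the minority class (class 1).
    Illustrative sketch only; not the paper's experimental protocol."""
    X0 = rng.normal(0.0, 1.0, n_major)          # majority class, mean 0
    X1 = rng.normal(sep, 1.0, n_minor)          # minority class, mean `sep`
    m0, m1 = X0.mean(), X1.mean()
    # Bayes threshold for equal-variance Gaussians with unequal priors:
    # the log prior ratio shifts the midpoint toward the minority class.
    thr = (m0 + m1) / 2.0 + np.log(n_major / n_minor) / (m1 - m0)
    X1_test = rng.normal(sep, 1.0, 20000)       # fresh minority samples
    return float(np.mean(X1_test > thr))

# Same 10:1 imbalance in both runs; only the class overlap changes.
r_low = minority_recall(5000, 500, sep=4.0)   # well-separated classes
r_high = minority_recall(5000, 500, sep=1.0)  # heavily overlapping classes
print(f"minority recall, low overlap:  {r_low:.2f}")
print(f"minority recall, high overlap: {r_high:.2f}")
```

Under low overlap the minority class is recovered almost perfectly despite the imbalance, while under high overlap the prior-shifted threshold swallows most of the minority class, consistent with the abstract's claim that the two factors must be studied jointly.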
CITATION STYLE
Denil, M., & Trappenberg, T. (2010). Overlap versus imbalance. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 6085 LNAI, pp. 220–231). https://doi.org/10.1007/978-3-642-13059-5_22