Large margin DragPushing strategy for centroid text categorization

Abstract

Among conventional methods for text categorization, the centroid classifier is simple and efficient. However, it often suffers from inductive bias (or model misfit) incurred by its underlying assumption. DragPushing is a simple yet effective method for addressing this inductive bias problem, but it employs only a single criterion, training-set error, as its objective function, which cannot guarantee generalization capability. In this paper, we propose a generalized DragPushing strategy for the centroid classifier, which we call "Large Margin DragPushing" (LMDP). Experiments conducted on three benchmark evaluation collections show that LMDP achieved about a one percent improvement over DragPushing and delivered performance nearly as good as a state-of-the-art SVM without incurring significant computational cost.
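
To make the idea concrete, below is a minimal sketch of a centroid classifier with a DragPushing-style, error-driven refinement step. The margin trigger used here (updating whenever the gap between the true-class similarity and the best competing similarity falls below a fixed threshold) is an illustrative assumption standing in for the large-margin criterion described in the paper, not the exact LMDP formulation; all function names and the toy data are hypothetical.

```python
# Sketch of a centroid classifier with a DragPushing-style update.
# The margin trigger is an illustrative assumption, not necessarily
# the exact LMDP criterion from the paper.
import numpy as np


def normalize(M):
    """L2-normalize rows, leaving all-zero rows untouched."""
    norms = np.linalg.norm(M, axis=1, keepdims=True)
    norms[norms == 0] = 1.0
    return M / norms


def train_centroids(X, y, n_classes):
    """Initial centroids: the normalized mean vector of each class."""
    centroids = np.zeros((n_classes, X.shape[1]))
    for c in range(n_classes):
        centroids[c] = X[y == c].mean(axis=0)
    return normalize(centroids)


def drag_push(X, y, centroids, eta=0.1, margin=0.0, epochs=10):
    """Refine centroids on documents with small or negative score margin.

    For each training document whose margin (true-class similarity minus
    the best competing similarity) falls below `margin`, drag the true
    centroid toward the document and push the rival centroid away.
    With margin=0 this reduces to plain, training-error-driven DragPushing.
    """
    for _ in range(epochs):
        for x, c in zip(X, y):
            scores = centroids @ x
            rival = np.argmax(
                np.where(np.arange(len(scores)) == c, -np.inf, scores))
            if scores[c] - scores[rival] < margin:
                centroids[c] += eta * x        # drag toward the true class
                centroids[rival] -= eta * x    # push away from the rival
                centroids = normalize(centroids)
    return centroids


def predict(X, centroids):
    """Assign each document to the class with the most similar centroid."""
    return np.argmax(X @ centroids.T, axis=1)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy tf-idf-like data: two noisy clusters in a 20-dimensional term space.
    X = np.vstack([
        rng.normal(0.0, 1.0, (50, 20)) + 2.0 * (np.arange(20) < 10),
        rng.normal(0.0, 1.0, (50, 20)) + 2.0 * (np.arange(20) >= 10),
    ])
    X = normalize(X)
    y = np.array([0] * 50 + [1] * 50)
    centroids = train_centroids(X, y, n_classes=2)
    centroids = drag_push(X, y, centroids, eta=0.1, margin=0.1)
    print("training accuracy:", (predict(X, centroids) == y).mean())
```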

Citation (APA)

Tan, S. (2007). Large margin DragPushing strategy for centroid text categorization. Expert Systems with Applications, 33(1), 215–220. https://doi.org/10.1016/j.eswa.2006.04.008
