Abstract
This paper focuses on weight trimming — a technique known for speeding up boosted learning procedures, typically with only a negligible loss of accuracy. Recently, Appel et al. proposed an elegant algorithm that applies weight trimming within AdaBoost and prunes some features using a special error bound, while simultaneously guaranteeing the same outcome (an ensemble of trees with exactly the same parameters) as training without trimming. Thus, no loss of training accuracy occurs. In this paper, we supplement Appel et al.'s idea with a suitable extension for real boosting. We prove that this approach gives the same outcome guarantees, both for stumps and for trees. Additionally, we analyze the computational complexity of Appel et al.'s idea and show that in some cases it may lead to computational losses.
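To make the classic weight-trimming heuristic concrete (the baseline technique that Appel et al.'s bound-based pruning refines), here is a minimal sketch of discrete AdaBoost with decision stumps and weight trimming. The function names, the `trim` parameter, and the use of scikit-learn stumps are illustrative assumptions, not the paper's implementation, and this is the plain heuristic rather than the exact, guarantee-preserving algorithm discussed above.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier


def adaboost_with_weight_trimming(X, y, n_rounds=50, trim=0.10):
    """Discrete AdaBoost with classic weight trimming (illustrative sketch).

    In every round, the examples carrying roughly the lowest `trim` fraction
    of the total weight mass are skipped when fitting the weak learner;
    all examples still take part in the weight update.
    Labels y are assumed to be in {-1, +1}.
    """
    n = len(y)
    w = np.full(n, 1.0 / n)      # example weights
    ensemble = []                # list of (alpha, stump)

    for _ in range(n_rounds):
        # Weight trimming: keep the heaviest examples whose weights
        # sum to at least (1 - trim) of the total mass.
        order = np.argsort(w)[::-1]                    # heaviest first
        cum = np.cumsum(w[order]) / w.sum()
        keep = order[: np.searchsorted(cum, 1.0 - trim) + 1]

        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X[keep], y[keep], sample_weight=w[keep])

        # Standard AdaBoost update carried out on ALL examples.
        pred = stump.predict(X)
        err = np.clip(np.sum(w[pred != y]) / w.sum(), 1e-12, 1 - 1e-12)
        alpha = 0.5 * np.log((1.0 - err) / err)
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()

        ensemble.append((alpha, stump))
    return ensemble


def predict(ensemble, X):
    """Sign of the weighted vote of the weak learners."""
    score = sum(alpha * stump.predict(X) for alpha, stump in ensemble)
    return np.sign(score)
```

The speed-up comes from fitting each weak learner on only the retained subset, while the weight update over all examples keeps the boosting distribution intact; unlike Appel et al.'s approach, this heuristic does not guarantee an identical ensemble to untrimmed training.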
Citation
Klęsk, P. (2016). Quick Real-Boost with: Weight Trimming, Exponential Impurity, Bins, and Pruning. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 9692, pp. 597–609). Springer Verlag. https://doi.org/10.1007/978-3-319-39378-0_51