Local Decorrelation For Improved Detection

  • Woonhyun Nam
  • Piotr Dollár
  • Joon Hee Han

  • 163 Mendeley readers
  • 101 citations


Even with the advent of more sophisticated, data-hungry methods, boosted decision trees remain extraordinarily successful for fast rigid object detection, achieving top accuracy on numerous datasets. While effective, most boosted detectors use decision trees with orthogonal (single feature) splits, and the topology of the resulting decision boundary may not be well matched to the natural topology of the data. Given highly correlated data, decision trees with oblique (multiple feature) splits can be effective. Use of oblique splits, however, comes at considerable computational expense. Inspired by recent work on discriminative decorrelation of HOG features, we instead propose an efficient feature transform that removes correlations in local neighborhoods. The result is an overcomplete but locally decorrelated representation ideally suited for use with orthogonal decision trees. In fact, orthogonal trees with our locally decorrelated features outperform oblique trees trained over the original features at a fraction of the computational cost. The overall improvement in accuracy is dramatic: on the Caltech Pedestrian Dataset, we reduce false positives nearly tenfold over the previous state-of-the-art.
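The core idea — learning filters that remove local correlations so that cheap orthogonal (single-feature) tree splits become effective — can be sketched as follows. This is a simplified illustration, not the authors' exact LDCF pipeline: it learns decorrelating filters as the top eigenvectors of the covariance of flattened local patches, and projects patches onto them to produce a locally decorrelated representation. The function names and the choice of k are illustrative assumptions.

```python
import numpy as np

def learn_decorrelation_filters(patches, k=4):
    """Learn k decorrelating filters as the top eigenvectors of the
    patch covariance (a sketch, not the paper's exact method).

    patches: (n, d) array of n flattened local patches of dimension d.
    Returns a (k, d) array of filters.
    """
    cov = np.cov(patches, rowvar=False)
    # eigh returns eigenvalues in ascending order; keep the top-k
    # eigenvectors (largest variance directions) as filters.
    _, vecs = np.linalg.eigh(cov)
    return vecs[:, ::-1][:, :k].T

def apply_filters(patches, filters):
    """Project patches onto the decorrelating filters, yielding an
    overcomplete but locally decorrelated feature representation."""
    return patches @ filters.T
```

Because the filters are eigenvectors of the patch covariance, the filter responses are mutually uncorrelated, which is exactly the property that lets orthogonal decision-tree splits match the data's topology without resorting to expensive oblique splits.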

Find this document

  • ISSN: 1049-5258
  • PUI: 605375324
  • SGR: 84937921067
  • arXiv: 1406.1134
  • SCOPUS: 2-s2.0-84937921067

