Gesture recognition for human-robot interaction is a prerequisite for many social robotic tasks. One of the main technical difficulties is hand tracking in crowded and dynamic environments; many existing methods have only been shown to work in clutter-free settings. This paper proposes a sensor-fusion-based hand tracking algorithm for crowded environments, shown to significantly improve the accuracy of existing hand detectors that use depth and RGB information. The main novelties of the proposed method are: a) a Monte-Carlo RGB update process to reduce false positives; b) online skin colour learning to cope with varying skin colour, clothing and illumination conditions; c) an asynchronous update method that integrates depth and RGB information for real-time applications. Tracking performance is evaluated in a number of controlled scenarios and crowded environments. All datasets used in this work have been made publicly available. © Springer International Publishing 2013.
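The abstract does not spell out the Monte-Carlo update, but the general pattern it names — Monte-Carlo (particle-filter-style) tracking weighted by an appearance likelihood such as skin colour — can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual algorithm: the hand position `(50, 50)`, the `skin_likelihood` function, and all parameters are hypothetical stand-ins for the learned skin-colour model and detector outputs described in the paper.

```python
import random

def predict(particles, sigma=2.0):
    # Diffuse each (x, y) particle with Gaussian motion noise.
    return [(x + random.gauss(0, sigma), y + random.gauss(0, sigma))
            for x, y in particles]

def weight(particles, likelihood):
    # Score each particle with an observation likelihood, then normalise.
    w = [likelihood(p) for p in particles]
    total = sum(w) or 1.0
    return [v / total for v in w]

def resample(particles, weights):
    # Draw a new particle set with probability proportional to the weights.
    return random.choices(particles, weights=weights, k=len(particles))

# Hypothetical appearance likelihood: peaked at an assumed hand position
# (50, 50); in the paper this role is played by a learned skin-colour model.
def skin_likelihood(p):
    x, y = p
    return 1.0 / (1.0 + (x - 50) ** 2 + (y - 50) ** 2)

# Initialise particles uniformly, then run one update per incoming RGB frame.
particles = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(500)]
for _ in range(10):
    particles = predict(particles)
    w = weight(particles, skin_likelihood)
    particles = resample(particles, w)

# The particle cloud concentrates around the high-likelihood region.
mx = sum(x for x, _ in particles) / len(particles)
my = sum(y for _, y in particles) / len(particles)
```

In the asynchronous setting the abstract describes, depth and RGB observations would each trigger such an update independently as they arrive, rather than waiting for a synchronised frame pair.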
McKeague, S., Liu, J., & Yang, G. Z. (2013). An asynchronous RGB-D sensor fusion framework using Monte-Carlo methods for hand tracking on a mobile robot in crowded environments. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 8239 LNAI, pp. 491–500). https://doi.org/10.1007/978-3-319-02675-6_49