In this paper, we propose that active perception can help achieve autonomous robotics in unstructured environments by providing robust perception. We test this claim with a biomimetic fingertip that senses surface texture over a range of contact depths. We compare the performance of passive Bayesian perception with a novel active-perception approach that includes a sensorimotor loop for controlling sensor position. Passive perception at a single depth performed poorly, with just 0.2 mm of position uncertainty impairing performance. Extending passive perception over a range of depths did not yield robust performance. Only active perception gave robust, accurate performance, with sensorimotor feedback compensating for position uncertainty. We expect these results to extend to other stimuli, so that active perception offers a general approach to robust perception in unstructured environments. © 2013 IEEE.
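The contrast the abstract draws can be illustrated with a toy model; this is a minimal sketch, not the authors' implementation, and all names, class means, noise levels, and the halving control gain are assumptions. Each "tap" yields a noisy reading biased by an unknown depth offset; a passive perceiver simply accumulates Bayesian evidence under the nominal-depth model, whereas the active loop also re-centres the sensor each tap, shrinking the offset that corrupts subsequent readings.

```python
import math
import random

# Hypothetical texture classes: each produces readings around a signature mean.
CLASS_MEANS = [0.0, 1.0, 2.0]
NOISE_STD = 0.3  # assumed sensor noise

def likelihood(reading, mean):
    """Gaussian likelihood of a reading under one texture class,
    evaluated assuming the sensor is at its nominal contact depth."""
    return math.exp(-0.5 * ((reading - mean) / NOISE_STD) ** 2)

def active_perception(true_class, depth_offset, n_taps=20, gain=0.5, seed=0):
    """Sequential Bayesian classification with a sensorimotor loop:
    after every tap the sensor moves toward the nominal contact depth,
    reducing the position error by `gain` (an illustrative controller)."""
    rng = random.Random(seed)
    posterior = [1.0 / len(CLASS_MEANS)] * len(CLASS_MEANS)
    for _ in range(n_taps):
        # Reading is biased by the residual depth offset plus sensor noise.
        reading = CLASS_MEANS[true_class] + depth_offset + rng.gauss(0, NOISE_STD)
        posterior = [p * likelihood(reading, m)
                     for p, m in zip(posterior, CLASS_MEANS)]
        total = sum(posterior)
        posterior = [p / total for p in posterior]
        # Active control step: re-centre the sensor (passive perception
        # would leave depth_offset unchanged here).
        depth_offset *= (1.0 - gain)
    return max(range(len(posterior)), key=posterior.__getitem__)
```

Because the controller drives the depth offset toward zero, later taps match the nominal-depth likelihood model and the posterior concentrates on the correct class; with the control step removed, a fixed offset comparable to the class spacing biases every reading and degrades classification, mirroring the passive-versus-active comparison described above.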
CITATION STYLE
Lepora, N. F., Martinez-Hernandez, U., & Prescott, T. J. (2013). Active touch for robust perception under position uncertainty. In Proceedings - IEEE International Conference on Robotics and Automation (pp. 3020–3025). https://doi.org/10.1109/ICRA.2013.6630996