Angle and position perception for exploration with active touch

Abstract

Over the past few decades the design of robots has gradually improved, allowing them to perform complex tasks in interaction with the world. To behave appropriately, robots must make perceptual decisions about their environment using their various sensory modalities. Even though robots are being equipped with progressively more accurate and advanced sensors, coping with uncertainty, both from the world and from their own sensory processes, remains unavoidable for autonomous robotics. The challenge is to develop robust methods that allow robots to perceive their environment while managing uncertainty and optimizing their decision making. Such methods can be inspired by the way humans and animals actively direct their senses towards locations that reduce perceptual uncertainty [1]. For instance, humans not only use their hands and fingers for exploration and feature extraction, but also guide their movements according to what is being perceived [2]. The same behaviour appears elsewhere in the animal kingdom: rats, for example, actively explore their environment by appropriately moving their whiskers [3]. © 2013 Springer-Verlag Berlin Heidelberg.
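The abstract does not detail the perception method itself, but decision making under sensory uncertainty of this kind is commonly formalized as sequential Bayesian evidence accumulation: the robot keeps tapping (sensing), updates a belief over hypotheses with Bayes' rule, and commits to a decision once one hypothesis crosses a confidence threshold. The sketch below is illustrative only, not the paper's algorithm; all names and parameters (`active_perception`, the Gaussian sensor models, the 0.99 threshold) are assumptions for the example.

```python
import math
import random

def gaussian_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def active_perception(likelihoods, true_class, threshold=0.99, max_taps=100, seed=1):
    """Sense repeatedly, update beliefs with Bayes' rule, and stop as soon
    as one hypothesis is believed with probability >= threshold.

    likelihoods: dict mapping class name -> (mu, sigma) of a hypothetical
    Gaussian sensor model for the reading produced by that class.
    """
    rng = random.Random(seed)
    classes = list(likelihoods)
    belief = {c: 1.0 / len(classes) for c in classes}  # uniform prior
    best = classes[0]
    for tap in range(1, max_taps + 1):
        # Simulated noisy reading drawn from the true class's sensor model.
        obs = rng.gauss(*likelihoods[true_class])
        # Bayesian update: posterior is proportional to prior x likelihood.
        belief = {c: belief[c] * gaussian_pdf(obs, mu, sigma)
                  for c, (mu, sigma) in likelihoods.items()}
        total = sum(belief.values())
        belief = {c: b / total for c, b in belief.items()}
        best = max(belief, key=belief.get)
        if belief[best] >= threshold:
            return best, tap  # confident enough: commit to a decision
    return best, max_taps  # budget exhausted: report the current best guess

# Hypothetical two-hypothesis example: which edge angle is under the sensor?
models = {"edge_at_30deg": (30.0, 5.0), "edge_at_60deg": (60.0, 5.0)}
decision, taps = active_perception(models, true_class="edge_at_30deg")
```

In a full active-touch system, the stopping rule would be paired with an action-selection step that moves the sensor to the most informative location before the next reading; here each tap simply resamples the same model, which is the part this sketch leaves out.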

Citation (APA)

Martinez-Hernandez, U., Dodd, T. J., Prescott, T. J., & Lepora, N. F. (2013). Angle and position perception for exploration with active touch. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 8064 LNAI, pp. 405–408). Springer Verlag. https://doi.org/10.1007/978-3-642-39802-5_49
