Gaze allocation analysis for a visually guided manipulation task


Abstract

Findings from eye movement research in humans have demonstrated that the task determines where we look. One hypothesis is that the purpose of looking is to reduce uncertainty about properties relevant to the task. Following this hypothesis, we define a model that poses the problem of where to look as one of maximising task performance by reducing task-relevant uncertainty. We implement and test our model on a simulated humanoid robot that has to move objects from a table into containers. Our model outperforms, and is more robust than, two baseline schemes in terms of task performance while we vary three environmental conditions: reach/grasp sensitivity, observation noise, and the camera's field of view. © 2012 Springer-Verlag.
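The abstract's core idea — choosing where to look so as to reduce task-relevant uncertainty — can be illustrated with a minimal sketch. This is not the authors' implementation; it assumes independent Gaussian beliefs over each object's position and a Kalman-style variance update for the fixated object, and all names (`choose_fixation`, `obs_noise_var`) are hypothetical:

```python
def posterior_variance(prior_var, obs_noise_var):
    # Kalman-style variance update after one noisy observation
    # of the fixated object: var' = var * r / (var + r).
    return prior_var * obs_noise_var / (prior_var + obs_noise_var)

def total_uncertainty_after_fixation(variances, target, obs_noise_var):
    # Sum of belief variances if we fixate `target`; unfixated
    # objects keep their prior variance (no observation received).
    return sum(
        posterior_variance(v, obs_noise_var) if i == target else v
        for i, v in enumerate(variances)
    )

def choose_fixation(variances, obs_noise_var):
    # Pick the object whose observation yields the lowest expected
    # remaining task-relevant uncertainty.
    return min(
        range(len(variances)),
        key=lambda t: total_uncertainty_after_fixation(variances, t, obs_noise_var),
    )
```

Under this toy model, gaze is drawn to the object with the largest positional uncertainty, since observing it gives the greatest expected variance reduction — e.g. with prior variances `[4.0, 1.0, 9.0]` and unit observation noise, `choose_fixation` selects the third object.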

Citation (APA)

Nunez-Varela, J., Ravindran, B., & Wyatt, J. L. (2012). Gaze allocation analysis for a visually guided manipulation task. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 7426 LNAI, pp. 44–53). https://doi.org/10.1007/978-3-642-33093-3_5
