Natural demonstration of manipulation skills for multimodal interactive robots

Abstract

This paper presents a novel approach to the natural demonstration of manipulation skills, especially grasping skills, for multimodal interactive robots. To teach grasping skills to such a robot, a human instructor uses natural spoken language together with grasping actions demonstrated to the robot. The proposed approach emphasizes four aspects of learning by demonstration. First, the dialog system for processing natural speech is considered. Second, an object detection and classification scheme for the robot is presented. Third, the correspondence problem is addressed by an algorithm that visually tracks the demonstrator's hands in real time and transforms the tracking results into an approach trajectory for a robotic arm. Fourth, the robot's hand configuration is fine-tuned for each grasp, using a criterion that evaluates a grasp for stability and for possible reuse of the grasped object. The approach produces stable grasps and is applied and evaluated on a multimodal service robot. © Springer-Verlag Berlin Heidelberg 2007.
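The correspondence step described above maps tracked hand positions (in the camera frame) into an approach trajectory for the robot arm (in the robot base frame). A minimal sketch of such a mapping, assuming a calibrated camera-to-robot homogeneous transform; the function name and interface are illustrative, not taken from the paper:

```python
import numpy as np

def hand_to_arm_trajectory(hand_points, T_cam_to_robot):
    """Map tracked 3D hand positions (camera frame) into a sequence of
    arm target points (robot base frame) using a 4x4 homogeneous transform.

    hand_points: iterable of (x, y, z) positions in the camera frame
    T_cam_to_robot: 4x4 homogeneous transform from camera to robot frame
    """
    trajectory = []
    for p in hand_points:
        # Promote the point to homogeneous coordinates (x, y, z, 1).
        ph = np.append(np.asarray(p, dtype=float), 1.0)
        # Apply the transform and drop the homogeneous coordinate.
        trajectory.append((T_cam_to_robot @ ph)[:3])
    return np.array(trajectory)
```

In a real system the transform would come from extrinsic camera calibration, and the resulting waypoints would still need smoothing and inverse kinematics before execution.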


Hüser, M., Baier-Löwenstein, T., Svagusa, M., & Zhang, J. (2007). Natural demonstration of manipulation skills for multimodal interactive robots. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4555 LNCS, pp. 888–897). Springer Verlag. https://doi.org/10.1007/978-3-540-73281-5_97
