Robot Learning with Task-Parameterized Generative Models

Abstract

Task-parameterized models provide a representation of movement/behavior that can adapt to a set of task parameters describing the current situation encountered by the robot, such as the location of objects or landmarks in its workspace. This paper gives an overview of the task-parameterized Gaussian mixture model (TP-GMM) presented in previous publications, and introduces a number of extensions and ongoing challenges required to move the approach toward unconstrained environments. In particular, it discusses its generalization capability and the handling of movements with a high number of degrees of freedom. It then shows that the method is not restricted to movements in task space, but that it can also be exploited to handle constraints in joint space, including priority constraints.
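The adaptation step at the core of TP-GMM can be summarized as a product of linearly transformed Gaussians: each mixture component is stored in several candidate coordinate frames, and for a new situation the frame-local Gaussians are mapped through the task parameters (a linear transform A_j and offset b_j per frame, e.g. an object pose) and multiplied together. The sketch below illustrates this step for a single component with NumPy; it is a minimal illustration under assumed variable names and toy dimensions, not code from the paper.

```python
# Minimal sketch (not the reference implementation): adapting one TP-GMM
# component to a new situation via a product of linearly transformed Gaussians.
# Each component is stored once per coordinate frame j as (mu[j], sigma[j]);
# a frame is described by the task parameters (A[j], b[j]).  All names and
# dimensions here are illustrative assumptions.

import numpy as np


def adapt_component(mu_local, sigma_local, A, b):
    """Combine one GMM component expressed in several frames.

    mu_local    : list of (D,) means, one per frame
    sigma_local : list of (D, D) covariances, one per frame
    A, b        : lists of task parameters, (D, D) linear part and (D,) offset
    Returns the global-frame mean and covariance of the adapted component.
    """
    precision_sum = np.zeros_like(sigma_local[0])
    weighted_mean = np.zeros_like(mu_local[0])
    for mu_j, sigma_j, A_j, b_j in zip(mu_local, sigma_local, A, b):
        # Map the frame-local Gaussian into the global frame.
        mu_g = A_j @ mu_j + b_j
        sigma_g = A_j @ sigma_j @ A_j.T
        # Accumulate the product of Gaussians in information (precision) form.
        precision = np.linalg.inv(sigma_g)
        precision_sum += precision
        weighted_mean += precision @ mu_g
    sigma_hat = np.linalg.inv(precision_sum)
    mu_hat = sigma_hat @ weighted_mean
    return mu_hat, sigma_hat


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    D = 2
    # Toy component seen from two frames (e.g. a start and a goal object).
    mu_local = [rng.normal(size=D) for _ in range(2)]
    sigma_local = [np.eye(D) * 0.05, np.eye(D) * 0.2]
    # New task parameters: frame orientations and origins for this situation.
    theta = 0.3
    A = [np.eye(D),
         np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])]
    b = [np.zeros(D), np.array([1.0, 0.5])]
    mu_hat, sigma_hat = adapt_component(mu_local, sigma_local, A, b)
    print("adapted mean:", mu_hat)
    print("adapted covariance:\n", sigma_hat)
```

The resulting Gaussian is narrow in directions where the frames agree and broad where they do not, which is what lets the retrieved movement generalize when the objects or landmarks move.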

Citation (APA)

Calinon, S. (2018). Robot Learning with Task-Parameterized Generative Models. In Springer Proceedings in Advanced Robotics (Vol. 3, pp. 111–126). Springer Science and Business Media B.V. https://doi.org/10.1007/978-3-319-60916-4_7
