In this paper, we present an approach for a robot to provide personalized dressing assistance to a user. In particular, given a dressing task, our approach finds a solution involving both manipulator motions and user-repositioning requests. Specifically, the solution allows the robot and the user to take turns moving in the same space and is cognizant of the user’s limitations. To accomplish this, a vision module monitors the human’s motion, determines whether the user is following the repositioning requests, and infers mobility limitations when they cannot. The learned constraints are used during future dressing episodes to personalize the repositioning requests. Our contributions include a turn-taking approach to human–robot coordination for the dressing problem and a vision module capable of learning user limitations. After presenting the technical details of our approach, we provide an evaluation with a Baxter manipulator.
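The turn-taking and limitation-learning behavior described above can be illustrated with a minimal sketch. This is not the authors' implementation: the function names, the per-joint angle representation, the 10-degree compliance tolerance, and the upper-bound-only limit model are all assumptions made for illustration.

```python
# Hypothetical sketch of a turn-taking dressing episode with learned
# mobility limits. All names and thresholds are illustrative assumptions.

TOLERANCE = 10.0  # degrees of slack before a pose counts as "not reached"


def dressing_episode(requests, observe, limits):
    """Alternate robot motion and user-repositioning requests.

    requests: list of target user poses, each a dict of joint name -> angle (deg)
    observe:  callable standing in for the vision module; given a requested
              pose, it returns the pose the user actually reached
    limits:   dict of joint name -> learned maximum reachable angle (deg),
              updated in place and reused in later episodes
    """
    for target in requests:
        # Personalize: skip requests that exceed a previously learned limit.
        if any(angle > limits.get(joint, 180.0) for joint, angle in target.items()):
            continue
        reached = observe(target)  # the user takes their turn; the robot waits
        for joint, wanted in target.items():
            got = reached[joint]
            if got + TOLERANCE < wanted:
                # The user fell well short of the request: infer a mobility
                # limitation and record the reached angle as this joint's limit.
                limits[joint] = min(limits.get(joint, 180.0), got)
```

For example, if a user's shoulder only reaches 60 degrees when asked for 90, the episode records a 60-degree limit, and a later episode skips any shoulder request above it.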
Klee, S. D., Ferreira, B. Q., Silva, R., Costeira, J. P., Melo, F. S., & Veloso, M. (2015). Personalized assistance for dressing users. In Lecture Notes in Computer Science (Vol. 9388, pp. 359–369). Springer. https://doi.org/10.1007/978-3-319-25554-5_36