Synthesizing human-like walking in constrained environments


Abstract

We present a new algorithm to generate plausible walking motion for high-DOF human-like articulated figures in constrained environments with multiple obstacles. Our approach combines hierarchical model decomposition with sample-based planning to efficiently compute a collision-free path in tight spaces. Furthermore, we use path perturbation and replanning techniques to satisfy the kinematic and dynamic constraints on the motion. In order to generate realistic human-like motion, we present a new motion blending algorithm that refines the path computed by the planner with motion capture data to compute a smooth and plausible trajectory. We demonstrate the results of generating motion corresponding to placing or lifting an object, walking, and bending for a 34-DOF articulated model.
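The abstract names sample-based planning as the component that finds a collision-free path in tight spaces. The paper does not specify the sampler, so as an illustrative sketch only, here is a minimal 2D RRT (rapidly-exploring random tree) with circular obstacles; the workspace bounds, obstacle model, step size, and goal bias are all assumptions for the example, not details from the paper, and the real system plans for a 34-DOF figure, not a point in the plane.

```python
import math
import random

def rrt_plan(start, goal, obstacles, step=0.5, iters=2000, goal_tol=0.5, seed=0):
    """Grow a tree from start toward goal in a 10x10 workspace,
    rejecting extensions that land inside a circular obstacle.
    obstacles: list of (cx, cy, radius) tuples (illustrative model)."""
    rng = random.Random(seed)
    nodes = [start]
    parent = {0: None}  # index of each node's parent in the tree

    def collides(p):
        return any(math.dist(p, (cx, cy)) <= r for cx, cy, r in obstacles)

    for _ in range(iters):
        # Bias 10% of samples toward the goal to speed convergence.
        sample = goal if rng.random() < 0.1 else (rng.uniform(0, 10), rng.uniform(0, 10))
        # Find the nearest tree node and extend it one step toward the sample.
        i = min(range(len(nodes)), key=lambda k: math.dist(nodes[k], sample))
        nx, ny = nodes[i]
        d = math.dist((nx, ny), sample)
        if d == 0:
            continue
        new = (nx + step * (sample[0] - nx) / d, ny + step * (sample[1] - ny) / d)
        if collides(new):
            continue  # reject extensions that hit an obstacle
        nodes.append(new)
        parent[len(nodes) - 1] = i
        if math.dist(new, goal) < goal_tol:
            # Reached the goal region: walk parent pointers back to the root.
            path, k = [], len(nodes) - 1
            while k is not None:
                path.append(nodes[k])
                k = parent[k]
            return path[::-1]
    return None  # no path found within the iteration budget

path = rrt_plan((1, 1), (9, 9), obstacles=[(5, 5, 1.5)])
```

The raw tree path is typically jagged; this is why the paper follows planning with perturbation, replanning, and motion blending against capture data to produce a smooth, plausible trajectory.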

Citation (APA)

Pan, J., Zhang, L., & Manocha, D. (2013). Synthesizing human-like walking in constrained environments. In Cognitive Systems Monographs (Vol. 18, pp. 181–186). Springer Verlag. https://doi.org/10.1007/978-3-642-36368-9_14
