GUD WIP: Gait-understanding-driven walking-in-place


Abstract

Many virtual environments require walking interfaces to explore virtual worlds much larger than the available real-world tracked space. We present a model for generating virtual locomotion speeds from Walking-In-Place (WIP) inputs based on walking biomechanics. By employing gait principles, our model, called Gait-Understanding-Driven Walking-In-Place (GUD WIP), creates output speeds that better match those evident in Real Walking and that respond better to variations in step frequency, including realistic starting and stopping. The speeds output by our implementation exhibit considerably less within-step fluctuation than a good current WIP system, Low-Latency, Continuous-Motion (LLCM) WIP, while remaining responsive to changes in user input. We compared the resulting speeds from Real Walking, GUD WIP, and LLCM-WIP in a user study: the average output speeds for Real Walking and GUD WIP respond consistently to changing step frequency, whereas LLCM-WIP is far less consistent. GUD WIP produces output speeds that are both more locally consistent (smooth) and more step-frequency-to-walk-speed consistent than LLCM-WIP. © 2010 IEEE.
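The abstract describes mapping WIP step frequency to a virtual walking speed via gait biomechanics. Below is a minimal sketch of that idea, assuming (as an illustrative placeholder, not the paper's actual model) that step length grows linearly with step frequency and scales with leg length; the function name and the constants `a` and `b` are hypothetical.

```python
def virtual_walk_speed(step_frequency_hz: float,
                       leg_length_m: float = 0.9,
                       a: float = 0.3,
                       b: float = 0.1) -> float:
    """Estimate virtual walking speed (m/s) from WIP step frequency (Hz).

    Assumed gait relation (illustrative, not from the GUD WIP paper):
        step_length = leg_length * (a + b * step_frequency)
        speed       = step_frequency * step_length
    so faster stepping yields both more steps per second and longer
    per-step strides, as real gait biomechanics suggest.
    """
    if step_frequency_hz <= 0.0:
        return 0.0  # no steps detected: user is standing still
    step_length_m = leg_length_m * (a + b * step_frequency_hz)
    return step_frequency_hz * step_length_m
```

Because speed depends superlinearly on step frequency here, the mapping responds consistently to frequency changes, which is the property the abstract highlights for GUD WIP versus LLCM-WIP.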

Citation (APA)

Wendt, J. D., Whitton, M. C., & Brooks, F. P. (2010). GUD WIP: Gait-understanding-driven walking-in-place. In Proceedings - IEEE Virtual Reality (pp. 51–58). https://doi.org/10.1109/VR.2010.5444812
