GUD WIP: Gait-understanding-driven walking-in-place

  • Jeremy D. Wendt
  • Mary Whitton (University of North Carolina at Chapel Hill)
  • Frederick P. Brooks


Many virtual environments require walking interfaces so that users can explore virtual worlds much larger than the available real-world tracked space. We present a model for generating virtual locomotion speeds from Walking-In-Place (WIP) inputs based on the biomechanics of human walking. By employing gait principles, our model, called Gait-Understanding-Driven Walking-In-Place (GUD WIP), produces output speeds that better match those seen in Real Walking and that respond better to variations in step frequency, including realistic starting and stopping. The speeds output by our implementation exhibit considerably less within-step fluctuation than those of a good current WIP system, Low-Latency, Continuous-Motion (LLCM) WIP, while remaining responsive to changes in user input. We compared the resulting speeds from Real Walking, GUD WIP, and LLCM-WIP in a user study: the average output speeds of Real Walking and GUD WIP vary consistently with changing step frequency, whereas LLCM-WIP is far less consistent. GUD WIP produces output speeds that are both more locally consistent (smoother) and more consistent between step frequency and walking speed than LLCM-WIP.
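The core idea, deriving a virtual locomotion speed from the user's step frequency via gait biomechanics, can be sketched as follows. This is an illustrative sketch only, not the paper's actual GUD WIP model: the linear step-length relation and the `WALK_RATIO` constant are assumptions chosen for the example.

```python
# Illustrative sketch (NOT the GUD WIP model itself): map step frequency
# to a walking speed using a simple gait-biomechanics relation.
# Assumption: step length grows roughly linearly with step frequency
# (an approximately constant "walk ratio"), so speed scales with frequency^2.

WALK_RATIO = 0.4  # meters per (steps/s); illustrative constant, varies by person


def speed_from_step_frequency(step_freq_hz: float) -> float:
    """Return a virtual locomotion speed (m/s) for a step frequency (steps/s)."""
    if step_freq_hz <= 0.0:
        return 0.0  # no steps detected: standing still
    step_length = WALK_RATIO * step_freq_hz  # assumed linear relation
    return step_length * step_freq_hz        # speed = step length * step frequency


# Example: 2 steps/s gives an assumed 0.8 m step length, i.e. about 1.6 m/s
print(speed_from_step_frequency(2.0))
```

A mapping like this responds smoothly to changes in step frequency, which is one way to see why a frequency-driven model can avoid the within-step speed fluctuation the abstract attributes to LLCM-WIP.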

Author-supplied keywords

  • H.5.1 [Information Interfaces and Presentation]: Multimedia Information Systems - artificial, augmented, and virtual realities
  • H.5.2 [Information Interfaces and Presentation]: User Interfaces - input devices and strategies
  • I.3.6

