A real-time sensing of gait and viewing direction for human interaction in virtual training applications


Abstract

This paper presents an integrated framework for real-time sensing and synchronization of the user's moving speed, moving direction, and viewing direction in a walking-in-place experience for virtual training applications. The framework consists of two inertial measurement units (IMUs) attached to each shank and an HMD built from an Android mobile device with a 3-axis orientation sensor. Although several prior works enable unconstrained omnidirectional walking through virtual environments, a low-cost interface solution built from wearable devices remains an important issue for virtual training systems. We provide a simplified technique for implementing 'walking in virtual reality' without an omnidirectional treadmill. In addition, this research aims at a lightweight (in terms of software) and portable (in terms of hardware) solution for implementing the Virtual Reality Walk-In-Place (VR WIP) interface for training applications.
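
To make the architecture concrete, the sketch below illustrates one possible walk-in-place update loop in the spirit of the abstract: shank-mounted IMU angular rates drive the forward speed, and the HMD's yaw sets the walking direction. This is a minimal illustration only; the gait model, thresholds, gains, and all function and field names (ImuSample, estimate_speed, step_virtual_position) are assumptions for exposition, not the authors' implementation.

import math
from dataclasses import dataclass

@dataclass
class ImuSample:
    gyro_pitch: float  # shank pitch angular rate in rad/s (hypothetical field)

def estimate_speed(left: ImuSample, right: ImuSample,
                   gain: float = 0.35, threshold: float = 0.5) -> float:
    """Map the stronger shank swing rate to a forward speed (m/s).

    A simple proportional mapping with a dead-band; the paper's gait
    model may differ.
    """
    swing = max(abs(left.gyro_pitch), abs(right.gyro_pitch))
    return gain * (swing - threshold) if swing > threshold else 0.0

def step_virtual_position(pos, head_yaw_rad: float, speed: float, dt: float):
    """Advance the virtual viewpoint along the HMD's viewing direction."""
    x, z = pos
    x += speed * dt * math.sin(head_yaw_rad)
    z += speed * dt * math.cos(head_yaw_rad)
    return (x, z)

# Example: one 20 ms frame with the right shank swinging at ~2 rad/s
# while the head faces 30 degrees to the right.
pos = (0.0, 0.0)
speed = estimate_speed(ImuSample(0.1), ImuSample(2.0))
pos = step_virtual_position(pos, head_yaw_rad=math.radians(30), speed=speed, dt=0.02)
print(pos)

Decoupling the speed estimate (from the leg-worn IMUs) from the heading (from the HMD orientation sensor) is what lets the user walk in place while still steering by looking, which is the core idea of a treadmill-free WIP interface.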

Citation (APA)

Ha, G., Lee, S., Cha, J., Lee, H., Kim, T., & Kim, S. (2015). A real-time sensing of gait and viewing direction for human interaction in virtual training applications. In Communications in Computer and Information Science (Vol. 528, pp. 485–490). Springer Verlag. https://doi.org/10.1007/978-3-319-21380-4_82
