There has been much discussion in recent years around the concept of the disappearing computer, and most of the results of that discussion have been realized in the form of mobile devices and applications. Somewhat lost in this discussion is the miniaturization of sensors that can be wirelessly attached to places and to people, enabling a new type of free-flowing interaction. To investigate what these new sensors could achieve, and at what cost, we implemented a configurable, wearable motion-capture system based on wireless sensor nodes that requires no special environment to operate in. We describe the system architecture and discuss the implications and opportunities it affords for innovative HCI design. As a practical application of the technology, we describe a prototype implementation of a pervasive, wearable augmented reality (AR) system built on the motion-capture system. The AR application uses body motion to visualize and interact with virtual objects populating AR settings, implementing a whole-body, gesture-driven interface for manipulating those objects. Gestures are mapped to corresponding behaviours of virtual objects, such as controlling the playback and volume of virtual audio players or displaying a virtual object's metadata.
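The abstract describes mapping recognized gestures to behaviours of virtual objects (e.g. playback and volume control of a virtual audio player). A minimal sketch of such a gesture-to-behaviour dispatch is shown below; the gesture names, the `VirtualAudioPlayer` class, and its methods are illustrative assumptions, not the paper's actual API.

```python
class VirtualAudioPlayer:
    """A virtual object whose behaviours are driven by whole-body gestures."""

    def __init__(self, title):
        self.title = title
        self.playing = False
        self.volume = 0.5  # normalized 0.0 .. 1.0

    def toggle_playback(self):
        self.playing = not self.playing

    def change_volume(self, delta):
        # Clamp volume to the valid range.
        self.volume = min(1.0, max(0.0, self.volume + delta))

    def show_metadata(self):
        return {"title": self.title, "playing": self.playing, "volume": self.volume}


# Hypothetical recognized gestures mapped to object behaviours.
GESTURE_ACTIONS = {
    "push_forward": lambda obj: obj.toggle_playback(),
    "raise_arm": lambda obj: obj.change_volume(+0.1),
    "lower_arm": lambda obj: obj.change_volume(-0.1),
    "point_and_hold": lambda obj: obj.show_metadata(),
}


def handle_gesture(gesture, obj):
    """Dispatch a recognized gesture to the targeted virtual object."""
    action = GESTURE_ACTIONS.get(gesture)
    return action(obj) if action else None
```

A table-driven dispatch like this keeps the gesture vocabulary easy to extend: adding a new gesture means adding one entry, without touching the recognition or object code.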
Smit, P., Barrie, P., Komninos, A., & Mandrychenko, O. (2011). Mirrored Motion: Augmenting Reality and Implementing Whole Body Gestural Control Using Pervasive Body Motion Capture Based on Wireless Sensors (pp. 35–50). https://doi.org/10.1007/978-0-85729-433-3_4