Mobile Augmented Reality (AR) applications based on navigation frameworks try to promote interaction beyond the desktop by employing wearable sensors, which collect the user's position, orientation, or various types of activity. Most navigation frameworks track the user's location and heading in the global coordinate frame using Global Positioning System (GPS) data. Researchers in wearable computing, on the other hand, have studied angular data of human body segments in the local coordinate frame using inertial orientation trackers. In this work, we combine the global and local coordinate frame approaches and provide a context-aware interaction framework for mobile devices that seamlessly changes Graphical User Interfaces (GUIs) for pedestrians navigating and working in urban environments. The system is designed and tested on two prototypes: a Personal Digital Assistant (PDA)-based navigation system and an Ultra-Mobile PC (UMPC)-based archaeological fieldwork assistant. In both cases, the computing device is equipped with a GPS receiver and an inertial orientation tracker. We introduce a method to estimate the orientation of a mobile user's hand. The recognition algorithm is based on state transitions triggered by time-line analysis of the pitch angle and angular velocity reported by the orientation tracker. The prototype system successfully differentiates between three postures, each of which we associate with a context of interest for pedestrian navigation systems: investigation, navigation, and idle.
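To make the posture-driven context switching concrete, the following is a minimal sketch of a state-transition classifier driven by pitch angle and angular velocity, in the spirit of the approach described above. It is not the authors' implementation: the threshold values, the pitch ranges assigned to each posture, and the stillness condition are all illustrative assumptions.

```python
# Illustrative sketch only: the thresholds, pitch ranges, and the idea of
# requiring a near-still hand before switching are assumptions, not values
# taken from the paper.
from enum import Enum


class Context(Enum):
    IDLE = "idle"                    # device hanging at the user's side
    NAVIGATION = "navigation"        # device held roughly horizontal (map view)
    INVESTIGATION = "investigation"  # device raised and tilted toward the eyes


def classify(pitch_deg: float, angular_velocity_dps: float,
             current: Context) -> Context:
    """Return the next context given the latest orientation-tracker sample.

    A transition is only triggered when the hand is nearly still, mimicking
    a time-line analysis that waits for the posture to settle before the GUI
    is switched.
    """
    STILLNESS_THRESHOLD = 15.0  # deg/s, assumed
    if abs(angular_velocity_dps) > STILLNESS_THRESHOLD:
        return current          # hand still moving: keep the previous context

    if pitch_deg < -60.0:       # pointing down: arm relaxed at the side
        return Context.IDLE
    elif pitch_deg < 20.0:      # roughly horizontal: navigation/map posture
        return Context.NAVIGATION
    else:                       # tilted up toward the face: close inspection
        return Context.INVESTIGATION


if __name__ == "__main__":
    # Feed a short stream of (pitch, angular velocity) samples.
    context = Context.IDLE
    for pitch, omega in [(-80.0, 2.0), (-10.0, 40.0), (-5.0, 3.0), (45.0, 1.0)]:
        context = classify(pitch, omega, context)
        print(f"pitch={pitch:6.1f} deg  omega={omega:5.1f} deg/s  ->  {context.value}")
```

In a deployed system the classifier would consume the tracker's sample stream continuously and the resulting context would select which GUI the pedestrian sees; the GPS receiver continues to supply position and heading in the global frame independently of this local-frame posture estimate.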
Kayalar, C., & Balcisoy, S. (2008). Natural Interaction Framework for Navigation Systems on Mobile Devices. In Advances in Human Computer Interaction. InTech. https://doi.org/10.5772/5928