Nonvisual, distal tracking of mobile remote agents in geosocial interaction

Abstract

With the recent introduction of mass-market mobile phones with location, bearing and acceleration sensing, we are on the cusp of significant progress in location-based interaction and highly interactive mobile social networking. We propose that such systems must work when subject to typical uncertainties in the sensed or inferred context, such as user location, bearing and motion. To examine the feasibility of such a system, we describe an experiment with an eyes-free, mobile implementation which allows users to find a target user, engage with them by pointing and tilting actions, and then have their attention directed to a specific target. Although weaknesses in the design of the tilt-distance mapping were indicated, users were, encouragingly, able to track the target and engage with the other agent. © Springer-Verlag Berlin Heidelberg 2009.
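The interaction the abstract describes, pointing a phone towards a remote user's sensed position and tilting it to probe distance under noisy location and bearing estimates, can be sketched roughly as below. This is a minimal, hypothetical Python sketch, not the authors' implementation: the function names, the heading-uncertainty window, and the linear tilt-distance mapping (a design the paper itself reports as weak) are all illustrative assumptions.

import math

# Illustrative sketch (assumed names and parameters, not the paper's code):
# 1) bearing from the user's sensed position to the target's sensed position,
# 2) a test of whether the device heading points at that bearing within an
#    uncertainty window, 3) a simple linear tilt-to-distance mapping.

def bearing_to_target(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing in degrees from point 1 to point 2."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return (math.degrees(math.atan2(y, x)) + 360.0) % 360.0

def pointing_at_target(heading_deg, target_bearing_deg, window_deg=15.0):
    """True if the sensed heading falls inside an uncertainty window around the target bearing."""
    diff = abs((heading_deg - target_bearing_deg + 180.0) % 360.0 - 180.0)
    return diff <= window_deg

def tilt_to_distance(pitch_deg, max_range_m=500.0):
    """Map device pitch (0 = flat, 90 = vertical) linearly onto a probed distance in metres."""
    pitch_deg = max(0.0, min(90.0, pitch_deg))
    return (pitch_deg / 90.0) * max_range_m

# Example: does a user heading 40 degrees, with the device tilted 30 degrees,
# point at a nearby target? (Coordinates are arbitrary illustrative values.)
tb = bearing_to_target(55.8721, -4.2882, 55.8750, -4.2840)
print(pointing_at_target(40.0, tb), tilt_to_distance(30.0))

In practice the fixed angular window would be replaced by a model of the combined location and heading uncertainty, and the linear tilt mapping by whatever mapping the evaluation favours.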

Citation (APA)

Strachan, S., & Murray-Smith, R. (2009). Nonvisual, distal tracking of mobile remote agents in geosocial interaction. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 5561 LNCS, pp. 88–102). https://doi.org/10.1007/978-3-642-01721-6_6
