Augmenting looking, pointing and reaching gestures to enhance the searching and browsing of physical objects

Citations: 25 · Mendeley readers: 47

Abstract

In this paper we present a framework for attaching information to physical objects so that it can be interactively browsed and searched in a hands-free, multi-modal, and personalized manner, leveraging users' natural looking, pointing, and reaching behaviors. The system uses small infrared transponders, both placed on objects in the environment and worn by the user, to achieve the dense, on-object visual feedback usually possible only in augmented reality systems, while improving on interaction style and reducing the wearable gear required. We discuss two implemented applications: a tutorial about the parts of an automobile engine and a personalized supermarket assistant. The paper continues with a user study investigating browsing and searching behaviors in the supermarket scenario, and concludes with a discussion of findings and future work.
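
The abstract describes the mechanism only at a high level. As a rough illustration, the sketch below (in Python) shows the interaction loop it implies: on-object infrared transponders broadcast IDs, a worn transceiver reports which transponder falls in the beam of a look, point, or reach gesture, and the system triggers on-object feedback when that object matches the user's personalized query. All names and data here (ObjectTag, on_gesture, the sample catalog) are hypothetical, not taken from the paper.

```python
# A minimal sketch (not the authors' implementation) of the interaction
# the abstract describes: information attached to tagged physical objects,
# gesture-directed selection, and on-object visual feedback (an LED) when
# the object matches the user's current search terms.

from dataclasses import dataclass, field

@dataclass
class ObjectTag:
    tag_id: int                 # ID broadcast by the object's IR transponder
    name: str                   # human-readable label
    info: str                   # information attached to the physical object
    keywords: set = field(default_factory=set)

# Information attached to tagged objects in the environment (hypothetical).
CATALOG = {
    1: ObjectTag(1, "olive oil", "Aisle 4: extra-virgin, 500 ml", {"oil", "cooking"}),
    2: ObjectTag(2, "oat milk", "Aisle 7: unsweetened", {"milk", "dairy-free"}),
}

def matches(tag: ObjectTag, query: set) -> bool:
    """Personalized filter: does this object match the user's search terms?"""
    return bool(query & (tag.keywords | {tag.name}))

def on_gesture(detected_tag_id: int, query: set):
    """Called when the worn transceiver reports a transponder in the beam
    of the user's look/point/reach gesture."""
    tag = CATALOG.get(detected_tag_id)
    if tag is None:
        return None              # untagged object: no feedback
    if matches(tag, query):
        # In hardware this would command the object's transponder to light
        # its LED, giving dense on-object visual feedback.
        print(f"[LED on] {tag.name}: {tag.info}")
    return tag

# Example: reaching toward object 2 while searching for dairy-free items.
on_gesture(2, {"dairy-free"})
```

In the supermarket scenario, the query set would come from the user's personal shopping list, so the same shelf lights up differently for each shopper.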

Citation (APA)

Merrill, D., & Maes, P. (2007). Augmenting looking, pointing and reaching gestures to enhance the searching and browsing of physical objects. In Lecture Notes in Computer Science (Vol. 4480, pp. 1–18). Springer-Verlag. https://doi.org/10.1007/978-3-540-72037-9_1
