Intelligent intent-aware touchscreen systems using gesture tracking with endpoint prediction


Abstract

Using an interactive display, such as a touchscreen, entails undertaking a pointing gesture and dedicating a considerable amount of attention to executing a selection task. In this paper, we give an overview of the concept of intent-aware interactive displays that can determine, early in a free-hand pointing gesture, the icon/item the user intends to select on the touchscreen. This can notably reduce the pointing time, aid in implementing effective selection-facilitation routines, and enhance the overall system accuracy as well as the user experience. Intent-aware displays employ a gesture-tracking sensor in conjunction with novel probabilistic intent inference algorithms to predict the endpoint of a free-hand pointing gesture. Real 3D pointing data is used to illustrate the usefulness and effectiveness of the proposed approach.
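To give a flavor of probabilistic endpoint prediction from a partial gesture, the sketch below implements a simple Bayesian heading-based model — an illustrative assumption of ours, not the paper's inference algorithm. It assumes the fingertip velocity at each step points roughly toward the intended icon with Gaussian angular noise, and returns a posterior over candidate icons; all names and parameters (e.g., `sigma`) are hypothetical.

```python
import math

def endpoint_posterior(trajectory, icons, sigma=0.3):
    """Posterior over candidate icon endpoints given a partial trajectory.

    Illustrative model (not the paper's): at each step the motion
    direction deviates from the direction to the intended icon by
    zero-mean Gaussian angular noise with std `sigma` radians.
    """
    log_post = [0.0] * len(icons)  # uniform prior in log space
    for (x0, y0), (x1, y1) in zip(trajectory, trajectory[1:]):
        vx, vy = x1 - x0, y1 - y0
        speed = math.hypot(vx, vy)
        if speed == 0:
            continue  # no motion, no directional evidence
        for i, (ix, iy) in enumerate(icons):
            dx, dy = ix - x0, iy - y0
            dist = math.hypot(dx, dy)
            if dist == 0:
                continue  # already at the icon
            # angle between the motion direction and the direction to icon i
            cos_a = max(-1.0, min(1.0, (vx * dx + vy * dy) / (speed * dist)))
            angle = math.acos(cos_a)
            log_post[i] += -0.5 * (angle / sigma) ** 2  # Gaussian log-likelihood
    # normalize in a numerically stable way
    m = max(log_post)
    weights = [math.exp(lp - m) for lp in log_post]
    total = sum(weights)
    return [w / total for w in weights]

# three candidate icons on a row; partial gesture heading toward the middle one
icons = [(0.0, 1.0), (1.0, 1.0), (2.0, 1.0)]
traj = [(1.0, 0.0), (1.0, 0.2), (1.02, 0.45), (0.99, 0.7)]
post = endpoint_posterior(traj, icons)
```

Because the posterior concentrates on the likely target well before the finger reaches the screen, a system could use it to enlarge or pre-select the predicted icon, shortening the pointing time — the facilitation idea the abstract describes.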

Citation (APA)

Ahmad, B. I., Langdon, P. M., Hardy, R., & Godsill, S. J. (2015). Intelligent intent-aware touchscreen systems using gesture tracking with endpoint prediction. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 9176, pp. 3–14). Springer Verlag. https://doi.org/10.1007/978-3-319-20681-3_1
