Gesture-Based Interaction: Visual Gesture Mapping

Abstract

Gesture-based interaction enables intuitive interaction with computers, machines, and robots without direct physical contact. The challenge is that there are no agreed-upon interaction patterns for gesture-based interaction in VR and AR environments. In this paper we developed a set of 10 gestures and corresponding visualizations in the following gesture categories: (1) directional movement, (2) flow control, (3) spatial orientation, (4) multifunctional gestures, and (5) tactile gestures. One of the multifunctional gestures and its visualization were selected for usability testing (N = 18) in a 3D car track simulator. We found that the visualization made the interaction faster, easier to understand, and more precise. Further, we learned that the visualization worked well as guidance for learning to control the car, but could be removed once the user had learned the interaction. By combining gestures from the library, gesture-based interaction can be used to control advanced machines, robots, and drones in an intuitive and non-strenuous way.
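The abstract does not include code, but to make the structure of the gesture library concrete, the following sketch shows one way such a library could be represented. Everything here (the GestureCategory enum, the Gesture record, and the two example entries) is a hypothetical illustration under assumed names, not the authors' implementation.

```python
from dataclasses import dataclass
from enum import Enum, auto


class GestureCategory(Enum):
    """The five gesture categories proposed in the paper."""
    DIRECTIONAL_MOVEMENT = auto()
    FLOW_CONTROL = auto()
    SPATIAL_ORIENTATION = auto()
    MULTIFUNCTIONAL = auto()
    TACTILE = auto()


@dataclass(frozen=True)
class Gesture:
    """One library entry: a gesture paired with the visualization
    that guides the user while performing it."""
    name: str
    category: GestureCategory
    visualization: str  # short description of the visual feedback


# Hypothetical entries for illustration; the paper defines 10 gestures.
GESTURE_LIBRARY = [
    Gesture("point-to-steer", GestureCategory.MULTIFUNCTIONAL,
            "arrow overlay showing direction and magnitude"),
    Gesture("open-palm-stop", GestureCategory.FLOW_CONTROL,
            "halo around the controlled object while stopping"),
]


def gestures_in(category: GestureCategory) -> list[Gesture]:
    """Return all library gestures in the given category."""
    return [g for g in GESTURE_LIBRARY if g.category == category]
```

Grouping gestures by category in this way mirrors the paper's taxonomy and would make it straightforward to combine gestures from different categories into a single control scheme, as the abstract suggests for controlling machines, robots, and drones.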

Citation (APA)

Rise, K., & Alsos, O. A. (2020). Gesture-Based Interaction: Visual Gesture Mapping. In Lecture Notes in Computer Science (Vol. 12182, pp. 106–124). Springer. https://doi.org/10.1007/978-3-030-49062-1_7
