Exploring user-defined gestures and voice commands to control an unmanned aerial vehicle

Abstract

In this paper, we follow a participatory design approach to explore what novice users consider intuitive ways to control an Unmanned Aerial Vehicle (UAV). We gather users’ suggestions for suitable voice and gesture commands through an online survey and a video interview, and we record the voice commands and gestures participants used in a Wizard of Oz experiment in which they believed they were manoeuvring a UAV. We identify commonalities in the data collected from the three elicitation methods and assemble a collection of voice and gesture command sets for navigating a UAV. Furthermore, to better understand why our participants chose the gestures and voice commands they did, we analyse the collected data in terms of mental models and identify three prevailing classes of mental models that likely guided many participants in their choice of voice and gesture commands.

Citation (APA)

Peshkova, E., Hitz, M., & Ahlström, D. (2017). Exploring user-defined gestures and voice commands to control an unmanned aerial vehicle. In Lecture Notes of the Institute for Computer Sciences, Social-Informatics and Telecommunications Engineering, LNICST (Vol. 178, pp. 47–62). Springer Verlag. https://doi.org/10.1007/978-3-319-49616-0_5
