Human-Robot Interaction Based on Gaze Gestures for the Drone Teleoperation

  • Yu M
  • Lin Y
  • Schmidt D
  • Wang X
  • Wang Y

Citations: N/A · Mendeley readers: 44

Abstract

Teleoperation has been widely used to perform tasks in dangerous or unreachable environments by replacing humans with controlled agents. Human-robot interaction (HRI) is central to teleoperation. Conventional HRI input devices include keyboards, mice, and joysticks; however, these are not suitable for users with disabilities, and they increase the mental workload of able-bodied users, who must operate multiple input devices by hand simultaneously. This study therefore presents HRI based on gaze tracking with an eye tracker. Object selection is critical and occurs frequently during HRI control, so this paper introduces gaze gestures as an object selection strategy for drone teleoperation. To test and validate the performance of the gaze gesture selection strategy, we evaluated both objective and subjective measures. The objective measures are drone control performance, namely mean task completion time and mean error rate; the subjective measure is an analysis of participant perception. The results show that the gaze gesture selection strategy has great potential as an additional HRI method for agent teleoperation.
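The abstract describes recognizing gaze gestures from eye-tracker data and using them to select objects during drone teleoperation. As a rough illustration only (the paper's own gesture vocabulary, thresholds, and recognizer are not reproduced here), the Python sketch below quantizes raw gaze samples into directional strokes and looks the stroke sequence up in a hypothetical gesture-to-command table; the gesture set, command names, and the 80-pixel stroke threshold are all assumptions, not values from the paper.

# Illustrative sketch (not the authors' implementation): classify a gaze
# gesture as a sequence of directional strokes recovered from raw
# eye-tracker samples, then map it to an object-selection command.
import math
from typing import List, Tuple

# Hypothetical gesture vocabulary; the paper defines its own gesture set.
GESTURES = {
    ("right", "left"): "select_next_object",
    ("up", "down"): "confirm_selection",
}

def _direction(dx: float, dy: float) -> str:
    """Quantize a displacement vector into one of four stroke directions."""
    return ("right" if dx > 0 else "left") if abs(dx) >= abs(dy) \
        else ("down" if dy > 0 else "up")

def strokes(samples: List[Tuple[float, float]], min_dist: float = 80.0) -> List[str]:
    """Collapse consecutive gaze samples into discrete strokes.

    A new stroke is emitted whenever the gaze has moved at least min_dist
    pixels from the last anchor point (assumed threshold, not from the paper).
    """
    result, anchor = [], samples[0]
    for x, y in samples[1:]:
        dx, dy = x - anchor[0], y - anchor[1]
        if math.hypot(dx, dy) >= min_dist:
            d = _direction(dx, dy)
            if not result or result[-1] != d:
                result.append(d)
            anchor = (x, y)
    return result

def classify(samples: List[Tuple[float, float]]) -> str:
    """Map a stroke sequence to a selection command, or 'none' if unrecognized."""
    return GESTURES.get(tuple(strokes(samples)), "none")

if __name__ == "__main__":
    # Synthetic gaze trace: sweep right, then back left -> "select_next_object".
    trace = [(100 + 20 * i, 300) for i in range(10)] + \
            [(280 - 20 * i, 300) for i in range(10)]
    print(classify(trace))

In a full teleoperation loop, the returned command string would feed the drone control layer, and the objective measures from the abstract (mean task completion time and mean error rate) would be logged around calls like classify() to compare gaze-gesture selection against conventional input devices.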

Citation (APA)

Yu, M., Lin, Y., Schmidt, D., Wang, X., & Wang, Y. (2014). Human-Robot Interaction Based on Gaze Gestures for the Drone Teleoperation. Journal of Eye Movement Research, 7(4). https://doi.org/10.16910/jemr.7.4.4
