Appearance-Based Gaze Estimator for Natural Interaction Control of Surgical Robots

Abstract

Robots play an increasingly important role in modern surgery. However, conventional human-computer interaction methods, such as joystick and voice control, have shortcomings, and medical personnel must undergo dedicated training to operate the robot. We propose a human-computer interaction model based on eye movement, with which medical staff can conveniently control the robot using their gaze. Our algorithm requires only an RGB camera and no expensive eye-tracking hardware. Two eye-control modes are designed in this paper. The first is pick-and-place movement, in which the user's gaze specifies the point to which the robotic arm should move. The second is user-command movement, in which the user's gaze selects the direction in which the robot should move. The experimental results demonstrate the feasibility and convenience of both modes.
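The abstract only names the two gaze-control modes; as a rough illustration, here is a minimal Python sketch (not the authors' implementation) of how a single gaze sample might be dispatched under each mode. All names, thresholds, and command strings here (ControlMode, GazeSample, move_to, jog) are hypothetical and stand in for whatever the paper's actual pipeline emits.

```python
# Hypothetical sketch of the two gaze-driven control modes described in
# the abstract. None of these names come from the paper itself.

from dataclasses import dataclass
from enum import Enum, auto


class ControlMode(Enum):
    PICK_AND_PLACE = auto()   # gaze point specifies a target position
    USER_COMMAND = auto()     # gaze region selects a motion direction


@dataclass
class GazeSample:
    x: float  # normalized screen coordinate in [0, 1]
    y: float  # normalized screen coordinate in [0, 1]


def dispatch(mode: ControlMode, gaze: GazeSample) -> str:
    """Turn one gaze sample into a high-level robot command string."""
    if mode is ControlMode.PICK_AND_PLACE:
        # Map the normalized gaze point directly to a workspace target.
        return f"move_to(target=({gaze.x:.2f}, {gaze.y:.2f}))"
    # USER_COMMAND: choose a jog direction from which screen region is fixated.
    dx, dy = gaze.x - 0.5, gaze.y - 0.5
    if abs(dx) >= abs(dy):
        return "jog(direction='right')" if dx > 0 else "jog(direction='left')"
    return "jog(direction='down')" if dy > 0 else "jog(direction='up')"


if __name__ == "__main__":
    print(dispatch(ControlMode.PICK_AND_PLACE, GazeSample(0.72, 0.31)))
    print(dispatch(ControlMode.USER_COMMAND, GazeSample(0.90, 0.45)))
```

In a real system the gaze sample would come from the paper's appearance-based estimator running on RGB camera frames, and the command strings would be replaced by calls into the robot controller; the split into a point-target mode and a direction-selection mode is the part taken from the abstract.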

Citation (APA)

Li, P., Hou, X., Duan, X., Yip, H., Song, G., & Liu, Y. (2019). Appearance-Based Gaze Estimator for Natural Interaction Control of Surgical Robots. IEEE Access, 7, 25095–25110. https://doi.org/10.1109/ACCESS.2019.2900424
