Intelligent Eye-Controlled Electric Wheelchair Based on Estimating Visual Intentions Using One-Dimensional Convolutional Neural Network and Long Short-Term Memory


Abstract

When an electric wheelchair is operated by gaze, eye movements used for checking the environment or observing objects are also incorrectly recognized as input operations. This phenomenon is called the “Midas touch problem”, and classifying visual intentions is therefore extremely important. In this paper, we develop a deep learning model that estimates the user’s visual intention in real time, together with an electric wheelchair control system that combines intention estimation with the gaze dwell time method. The proposed model is a 1DCNN-LSTM that estimates visual intention from feature vectors of 10 variables, including eye movement, head movement, and distance to the fixation point. Evaluation experiments classifying four types of visual intention show that the proposed model achieves the highest accuracy among the compared models. In addition, driving experiments with an electric wheelchair implementing the proposed model show that the user’s operating effort is reduced and that operability is improved compared with the traditional method. From these results, we conclude that visual intentions can be estimated more accurately by learning time-series patterns from eye and head movement data.
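The 1DCNN-LSTM architecture described above can be sketched as follows. This is a minimal illustrative implementation in PyTorch, assuming a plausible layer layout (a 1D convolution over the time axis followed by an LSTM and a linear classifier); the channel and hidden sizes are hypothetical, not the paper's actual hyperparameters.

```python
import torch
import torch.nn as nn


class CNN1DLSTM(nn.Module):
    """Illustrative 1D-CNN + LSTM visual-intention classifier.

    Input: a batch of gaze/head-movement sequences shaped
    (batch, time_steps, 10 features). Output: logits over the
    four visual-intention classes. Layer sizes are assumptions.
    """

    def __init__(self, n_features=10, n_classes=4,
                 conv_channels=32, lstm_hidden=64):
        super().__init__()
        # 1D convolution over time extracts local motion patterns
        # from the 10-variable feature vectors.
        self.conv = nn.Sequential(
            nn.Conv1d(n_features, conv_channels, kernel_size=3, padding=1),
            nn.ReLU(),
        )
        # LSTM models the longer-range temporal structure.
        self.lstm = nn.LSTM(conv_channels, lstm_hidden, batch_first=True)
        self.fc = nn.Linear(lstm_hidden, n_classes)

    def forward(self, x):
        # x: (batch, time, features); Conv1d expects (batch, channels, time).
        h = self.conv(x.transpose(1, 2)).transpose(1, 2)
        out, _ = self.lstm(h)             # (batch, time, hidden)
        return self.fc(out[:, -1, :])     # classify from the last time step


model = CNN1DLSTM()
logits = model(torch.randn(8, 30, 10))   # 8 sequences, 30 steps, 10 features
print(logits.shape)                      # torch.Size([8, 4])
```

In a real-time controller, the predicted class at each window would gate the dwell-time input method, so that only gazes classified as operational intent accumulate toward a wheelchair command.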


Citation (APA)

Higa, S., Yamada, K., & Kamisato, S. (2023). Intelligent Eye-Controlled Electric Wheelchair Based on Estimating Visual Intentions Using One-Dimensional Convolutional Neural Network and Long Short-Term Memory. Sensors, 23(8). https://doi.org/10.3390/s23084028

