A hybrid brain–computer interface for real-life meal-assist robot control

Abstract

Assistive devices such as meal-assist robots aid individuals with disabilities and support the elderly in performing daily activities. However, existing meal-assist robots are inconvenient to operate because of non-intuitive user interfaces, requiring additional time and effort. We therefore developed a meal-assist robot system driven by a hybrid brain–computer interface based on three signals that can be measured with scalp electrodes for electroencephalography. A single meal cycle comprises the following three steps. (1) Triple eye-blinks (EBs), detected from the prefrontal channel, serve as the activation command that initiates the cycle. (2) Steady-state visual evoked potentials (SSVEPs), recorded from occipital channels, are used to select the food according to the user's intention. (3) Electromyograms (EMGs), recorded from temporal channels while the user chews the food, mark the end of the cycle and readiness for the next one. In experiments with five subjects, accuracy was 94.67% (EBs), 83.33% (SSVEPs), and 97.33% (EMGs); the false positive rate was 0.11 times/min (EBs) and 0.08 times/min (EMGs); and the information transfer rate was 20.41 bits/min (SSVEPs). These results demonstrate the feasibility of the proposed assistive system, which allows users to eat on their own more naturally. Furthermore, it can increase the self-esteem of disabled and elderly people and enhance their quality of life.
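
As a rough illustration of the three-step meal cycle described above, the Python sketch below wires together three detectors: a threshold-crossing triple-blink trigger, CCA-based SSVEP frequency recognition, and an RMS-threshold chewing detector for the temporalis EMG. This is a minimal sketch only: the sampling rate, thresholds, flicker frequencies, channel labels, and the get_window helper are all illustrative assumptions, not values or interfaces from the paper.

    # Minimal sketch of the three-step meal cycle.
    # All thresholds, frequencies, and channel names are assumptions for illustration.
    import numpy as np
    from sklearn.cross_decomposition import CCA

    FS = 250                                # assumed EEG sampling rate (Hz)
    SSVEP_FREQS = [6.0, 7.5, 10.0, 12.0]    # hypothetical flicker frequencies, one per food item

    def detect_triple_blink(prefrontal, thresh=80.0):
        """Step 1: count rising threshold crossings in the prefrontal window;
        three or more within the window is treated as a triple eye-blink."""
        crossings = np.flatnonzero((prefrontal[1:] >= thresh) & (prefrontal[:-1] < thresh))
        return len(crossings) >= 3

    def classify_ssvep(occipital):
        """Step 2: CCA-based frequency recognition. Returns the index of the
        flicker frequency whose sine/cosine reference correlates best with
        the occipital channels (channels x samples)."""
        n = occipital.shape[1]
        t = np.arange(n) / FS
        scores = []
        for f in SSVEP_FREQS:
            # Reference set: fundamental and second harmonic, sine and cosine.
            ref = np.column_stack(
                [np.sin(2 * np.pi * h * f * t) for h in (1, 2)]
                + [np.cos(2 * np.pi * h * f * t) for h in (1, 2)])
            u, v = CCA(n_components=1).fit_transform(occipital.T, ref)
            scores.append(abs(np.corrcoef(u[:, 0], v[:, 0])[0, 1]))
        return int(np.argmax(scores))

    def detect_chewing(temporal, thresh=50.0):
        """Step 3: flag chewing when the temporal-channel EMG RMS amplitude
        exceeds a resting baseline threshold (assumed microvolts)."""
        return np.sqrt(np.mean(temporal ** 2)) > thresh

    def meal_cycle(get_window):
        """One cycle: wait for a triple blink, select food via SSVEP, then
        wait for chewing EMG, which marks readiness for the next cycle.
        get_window(name) is a hypothetical helper returning the latest
        signal window for the named channel group."""
        if not detect_triple_blink(get_window("prefrontal")):
            return None                       # stay idle until the user activates
        food = classify_ssvep(get_window("occipital"))
        while not detect_chewing(get_window("temporal")):
            pass                              # robot delivers food; poll for chewing
        return food

The CCA step follows the standard frequency-recognition approach for SSVEP BCIs (canonically correlating each candidate frequency's harmonic references with the multichannel signal and picking the best-correlated frequency); whether the paper uses this exact classifier is not stated in the abstract.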

Citation (APA)

Ha, J., Park, S., Im, C. H., & Kim, L. (2021). A hybrid brain–computer interface for real-life meal-assist robot control. Sensors, 21(13). https://doi.org/10.3390/s21134578
