Development of EOG and EMG-based multimodal assistive systems

Abstract

This study presents a novel human-computer interface (HCI) approach to designing a computer-aided control and communication system, based on electrooculogram (EOG) and electromyogram (EMG) signals, for people with severely impaired motor function and communication ability. The EOG and EMG signals correspond to eye movements and voluntary eye blinks, respectively. The acquired signals were processed and classified in a MATLAB-based graphical user interface (GUI) to detect the different eye movements. A pair of Hall-effect sensors was conditioned for concurrent use with multidirectional eye movements or voluntary eye blinks to generate multipurpose serial commands that control the movement of a robotic vehicle (a representative assistive aid) and communication support systems. User details were registered and system operability was monitored in the same GUI. Because the proposed device supports multitasking and is easy to use, it can improve the quality of life and independence of incapacitated individuals.
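The pipeline the abstract describes (acquire EOG/EMG, classify the eye event, emit a serial command to the assistive aid) can be sketched as follows. The original system was implemented in MATLAB; this is a hypothetical Python re-expression, and all thresholds, channel conventions, and command bytes are assumptions for illustration, not the chapter's actual values.

```python
# Hypothetical sketch of the EOG/EMG -> command pipeline described in the
# abstract. Thresholds, channel sign conventions, and the command bytes
# sent over serial are illustrative assumptions, not the published design.

def classify_eog(h_amp, v_amp, thresh=0.5):
    """Classify an EOG epoch from horizontal/vertical channel amplitudes."""
    if abs(h_amp) < thresh and abs(v_amp) < thresh:
        return "rest"
    if abs(h_amp) >= abs(v_amp):
        return "left" if h_amp < 0 else "right"
    return "up" if v_amp > 0 else "down"


def detect_blink(emg_amp, thresh=1.0):
    """A voluntary blink appears as a high-amplitude EMG burst."""
    return emg_amp > thresh


# Assumed mapping from classified eye events to single-byte serial
# commands for the robotic vehicle (forward/back/left/right/stop).
COMMANDS = {"up": b"F", "down": b"B", "left": b"L", "right": b"R", "rest": b"S"}


def to_command(h_amp, v_amp, emg_amp):
    """Turn one epoch of conditioned signal amplitudes into a command byte."""
    if detect_blink(emg_amp):  # blink used here as a stop/select action
        return b"S"
    return COMMANDS[classify_eog(h_amp, v_amp)]
```

In a deployed system the returned byte would be written to the serial port driving the vehicle's controller; a real implementation would also debounce events and reject involuntary blinks by duration.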

Citation (APA)

Champaty, B., Tibarewala, D. N., Mohapatra, B., & Pal, K. (2016). Development of EOG and EMG-based multimodal assistive systems. In Studies in Computational Intelligence (Vol. 651, pp. 285–310). Springer Verlag. https://doi.org/10.1007/978-3-319-33793-7_13
