Many people live with movement disabilities and would benefit from an assistive mobility device with a practical control scheme. This paper demonstrates a face-machine interface system that uses motion artifacts in electroencephalogram (EEG) signals for mobility enhancement in people with quadriplegia. EEG signals were acquired with an Emotiv EPOC X neuroheadset. With the proposed system, we verified the preprocessing approach, feature extraction algorithms, and control modalities. Using eye winks and jaw movements, the system achieved an average accuracy of 96.9% across four commands. Moreover, online control of a simulated power wheelchair showed high efficiency in terms of task completion time: the combination of winking and jaw chewing yielded steering times on the same order of magnitude as joystick-based control, although still roughly twice as long. We will further improve efficiency and implement the proposed face-machine interface system on a real power wheelchair.
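Since the abstract only outlines the pipeline (preprocessing, artifact feature extraction, and mapping to control commands), the following is a minimal Python sketch of how wink and jaw-movement artifacts could be thresholded into four wheelchair commands. The channel choices (AF3/AF4 for winks, T7/T8 for jaw activity), thresholds, and command mapping are illustrative assumptions, not the authors' reported method.

```python
import numpy as np

# Hypothetical channel indices for an Emotiv EPOC X montage; the paper's
# actual channel selection and thresholds are not given in the abstract.
CHANNELS = {"AF3": 0, "AF4": 1, "T7": 2, "T8": 3}

def classify_window(window, wink_thresh=80.0, jaw_thresh=60.0):
    """Map a short EEG window (channels x samples, in microvolts) to one of
    four wheelchair commands using simple peak-to-peak amplitude features.

    Returns 'left', 'right', 'forward', 'stop', or None (no command).
    """
    p2p = window.max(axis=1) - window.min(axis=1)  # peak-to-peak per channel

    left_wink = p2p[CHANNELS["AF3"]] > wink_thresh
    right_wink = p2p[CHANNELS["AF4"]] > wink_thresh
    jaw_move = (p2p[CHANNELS["T7"]] > jaw_thresh and
                p2p[CHANNELS["T8"]] > jaw_thresh)

    if jaw_move:
        return "forward"
    if left_wink and right_wink:
        return "stop"
    if left_wink:
        return "left"
    if right_wink:
        return "right"
    return None

# Example: a simulated 1-second window (4 channels x 128 samples at 128 Hz)
rng = np.random.default_rng(0)
window = rng.normal(0.0, 5.0, size=(4, 128))
window[CHANNELS["AF3"]] += 100.0 * np.hanning(128)  # synthetic left-wink artifact
print(classify_window(window))  # -> 'left'
```

In practice, the thresholds would be calibrated per user and the peak-to-peak feature could be replaced by whatever feature extraction the full paper specifies.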
Citation: Saichoo, T., Boonbrahm, P., & Punsawad, Y. (2021). A face-machine interface utilizing EEG artifacts from a neuroheadset for simulated wheelchair control. International Journal on Smart Sensing and Intelligent Systems, 14(1), 1–10. https://doi.org/10.21307/ijssis-2021-015