EEG-Controlled Prosthetic Arm for Micromechanical Tasks

Abstract

Brain-controlled prosthetics has become one of the significant areas of brain–computer interface (BCI) research. This paper introduces a novel approach that extracts eye-blink signals from EEG to control a prosthetic arm. The coded eye blinks are extracted and used as major task commands to control the movement of the prosthetic arm, which is built using 3D printing technology. Each major task is converted into micromechanical tasks by a microcontroller. To classify the commands, features are extracted from the time and spectral domains of the EEG signals and classified using machine learning methods. The two classification techniques used are Linear Discriminant Analysis (LDA) and K-Nearest Neighbor (KNN). EEG data were obtained from 10 healthy subjects, and the performance of the system was evaluated using accuracy, precision, and recall measures. LDA achieved accuracy, precision, and recall of 97.7%, 96%, and 95.3%, respectively, while KNN achieved 70.7%, 67.3%, and 68%.
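
As a rough sketch of the classification and evaluation stage described above (not the authors' implementation), the following Python example trains LDA and KNN classifiers with scikit-learn on placeholder feature vectors and reports the same accuracy, precision, and recall measures. The data, the number of neighbors (k = 5), and the macro-averaging of precision and recall are all assumptions for illustration; the paper does not specify these details.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics import accuracy_score, precision_score, recall_score
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Placeholder data standing in for the extracted time/spectral-domain
# EEG features: 150 trials, 12 features each, 3 blink-coded commands.
rng = np.random.default_rng(0)
X = rng.normal(size=(150, 12))
y = rng.integers(0, 3, size=150)

# Hold out 30% of trials for evaluation, preserving class balance.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

# Train and score both classifiers compared in the paper.
for name, clf in [("LDA", LinearDiscriminantAnalysis()),
                  ("KNN", KNeighborsClassifier(n_neighbors=5))]:
    clf.fit(X_train, y_train)
    y_pred = clf.predict(X_test)
    print(f"{name}: "
          f"accuracy={accuracy_score(y_test, y_pred):.3f}, "
          f"precision={precision_score(y_test, y_pred, average='macro'):.3f}, "
          f"recall={recall_score(y_test, y_pred, average='macro'):.3f}")
```

In the paper's setting, X would hold the time- and spectral-domain features extracted from each EEG trial and y the blink-coded command labels sent to the microcontroller.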

CITATION STYLE

APA

Gayathri, G., Udupa, G., Nair, G. J., & Poorna, S. S. (2018). EEG-controlled prosthetic arm for micromechanical tasks. In Advances in Intelligent Systems and Computing (Vol. 712, pp. 281–291). Springer Verlag. https://doi.org/10.1007/978-981-10-8228-3_26
