Brain-computer interface: Controlling a robotic arm using facial expressions


Abstract

The aim of this paper is to develop a brain–computer interface (BCI) system that can control a robotic arm using EEG signals generated by facial expressions. The EEG signals are acquired using a neurosignal acquisition headset. The robotic arm consists of a 3-D printed prosthetic hand attached to a forearm and elbow made of craft wood. The arm is designed to perform four moves, each controlled by one facial expression; hence, four different EEG signals are used in this work. The performance of the BCI robotic arm was evaluated by testing it on 10 subjects. Initially, 14 electrodes were used to collect the EEG signals, yielding an accuracy of around 95%. We further analyzed the minimum number of electrodes required for the system to function properly and found that seven electrodes (instead of 14), placed in the parietal, temporal, and frontal regions, are sufficient. The accuracy of the system with 7 electrodes remains around 95%.
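As a rough illustration of the control scheme the abstract describes, the loop below classifies one of four facial-expression EEG patterns and maps each class to one of the arm's four moves. This is a minimal sketch under stated assumptions, not the authors' implementation: the channel names, expression labels, move names, and the placeholder classifier are all hypothetical.

```python
# Sketch of the BCI control loop described in the abstract: classify one
# of four facial-expression EEG patterns, then map each class to one of
# the robotic arm's four moves. All names below (channels, expressions,
# moves) are illustrative assumptions, not the authors' choices.

# The paper reports that 7 electrodes in the parietal, temporal, and
# frontal regions suffice; these specific channel names are assumed.
CHANNELS = ["F3", "F4", "T7", "T8", "P7", "P8", "Pz"]

# Assumed mapping of four facial expressions to four arm moves.
EXPRESSION_TO_MOVE = {
    "blink": "grip_close",
    "raise_brow": "grip_open",
    "smile": "elbow_up",
    "clench": "elbow_down",
}

def classify_expression(eeg_window):
    """Placeholder classifier: pick the channel with the largest mean
    absolute amplitude and map its scalp region to an expression. A
    real system would use trained features, not this heuristic."""
    means = {ch: sum(abs(v) for v in vals) / len(vals)
             for ch, vals in eeg_window.items()}
    dominant = max(means, key=means.get)
    # Arbitrary assumed association of scalp regions with expressions.
    region_map = {"F": "raise_brow", "T": "clench", "P": "smile"}
    return region_map.get(dominant[0], "blink")

def command_for(eeg_window):
    """Translate one window of EEG samples into an arm command."""
    return EXPRESSION_TO_MOVE[classify_expression(eeg_window)]
```

In a deployed system, `command_for` would run on each incoming window of headset samples and forward the resulting command to the arm's motor controller.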

Citation (APA)

Nisar, H., Khow, H. W., & Yeap, K. H. (2018). Brain-computer interface: Controlling a robotic arm using facial expressions. Turkish Journal of Electrical Engineering and Computer Sciences, 26(2), 707–720. https://doi.org/10.3906/elk-1606-296
