A Study of Mobile Robot Control using EEG Emotiv Epoch Sensor

  • Victorio Yasin T
  • Pasila F
  • Lim R

© The Authors, published by EDP Sciences, 2018.

This study used an EEG Emotiv Epoc+ sensor to recognize brain activity for controlling a mobile robot's movement. EEG commands were identified with the Emotiv Control Panel software, relayed through the Mind Your OSCs software and a Processing sketch, and then processed by an Arduino controller. The Arduino's output is a movement command (i.e., forward, backward, turn left, or turn right). The system was trained with three thinking modes: first, thinking combined with facial expressions; second, thinking with visual aids; third, thinking mentally without any aid. In the first mode, two configurations of facial expressions were used as commands for the mobile robot. The second facial-expression configuration proved the best method, with a success rate of 88.33 % and an overall response time 1.60175 s faster than the first configuration. Both facial-expression methods showed dominant signals in the frontal lobe. The second facial-expression method's overall response time was also 6.12 s and 9.53 s faster than thinking with visual aids and thinking without aids, respectively.
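The abstract describes a pipeline in which recognized EEG commands are forwarded (via Mind Your OSCs and Processing) to an Arduino, which maps them to movements. A minimal sketch of that final mapping step, assuming hypothetical command labels ("forward", "backward", "left", "right") arriving as strings over the serial link — the paper does not specify the wire format, so the names and the fail-safe behavior here are illustrative only:

```cpp
#include <string>

// Hypothetical movement states for the mobile robot, matching the four
// commands named in the abstract plus a safe default.
enum class Move { Stop, Forward, Backward, TurnLeft, TurnRight };

// Decode one command label (as it might arrive from the Processing sketch
// over serial) into a movement. Any unrecognized input maps to Stop, so a
// dropped or garbled EEG classification halts the robot rather than
// repeating the last motion.
Move decodeCommand(const std::string& cmd) {
    if (cmd == "forward")  return Move::Forward;
    if (cmd == "backward") return Move::Backward;
    if (cmd == "left")     return Move::TurnLeft;
    if (cmd == "right")    return Move::TurnRight;
    return Move::Stop;
}
```

On the actual Arduino, `decodeCommand` would be called from `loop()` on each line read from `Serial`, with the returned state driving the motor pins.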
Victorio Yasin, T., Pasila, F., & Lim, R. (2018). A Study of Mobile Robot Control using EEG Emotiv Epoch Sensor. MATEC Web of Conferences, 164, 01044. https://doi.org/10.1051/matecconf/201816401044
