Video Category Classification Using Wireless EEG

Abstract

In this paper, we analyze EEG signals to classify the type of video a person is watching, which we believe is the first step toward a BCI-based video recommender system. To this end, we set up an experiment in which 13 subjects were shown three different types of videos. To classify these videos from the subjects' EEG data with high accuracy, we evaluated several state-of-the-art algorithms for each submodule (pre-processing, feature extraction, feature selection and classification) of the signal processing module of a BCI system, in order to determine which combination of algorithms best predicts the type of video being watched. The best results (80.0% average accuracy with 32.32 ms average total execution time per subject) were obtained using data from channel AF8, i.e. the electrode located over the right frontal lobe of the brain. The combination of algorithms that achieved this highest average accuracy of 80.0% consists of FIR Least Squares, Welch Spectrum, Principal Component Analysis and AdaBoost for the submodules pre-processing, feature extraction, feature selection and classification, respectively.
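The following is a minimal sketch of the best-performing pipeline named in the abstract (FIR Least Squares pre-processing, Welch Spectrum features, PCA feature selection, AdaBoost classification) applied to single-channel (AF8) data. The sampling rate, filter band, window length and PCA/AdaBoost parameters are illustrative assumptions, not values reported in the paper.

```python
# Sketch of the reported pipeline: FIR Least Squares -> Welch Spectrum -> PCA -> AdaBoost.
# All numeric parameters below (FS, filter band, n_components, n_estimators) are assumed.
import numpy as np
from scipy.signal import firls, filtfilt, welch
from sklearn.decomposition import PCA
from sklearn.ensemble import AdaBoostClassifier
from sklearn.pipeline import make_pipeline

FS = 256  # assumed sampling rate in Hz

def preprocess(trial):
    """Band-pass the raw AF8 signal with an FIR least-squares filter (1-40 Hz assumed)."""
    taps = firls(101,
                 bands=[0, 0.5, 1, 40, 45, FS / 2],
                 desired=[0, 0, 1, 1, 0, 0],
                 fs=FS)
    return filtfilt(taps, [1.0], trial)

def extract_features(trial):
    """Welch power spectral density of the filtered signal as the feature vector."""
    _, psd = welch(trial, fs=FS, nperseg=FS)
    return psd

def train_classifier(X_raw, y):
    """X_raw: (n_trials, n_samples) array of AF8 recordings; y: video-category labels."""
    X = np.array([extract_features(preprocess(t)) for t in X_raw])
    model = make_pipeline(PCA(n_components=10),        # feature selection (assumed dimensionality)
                          AdaBoostClassifier(n_estimators=50))
    return model.fit(X, y)
```

In this sketch the per-trial feature vector is simply the PSD of the filtered channel; subject-wise cross-validation and the paper's actual parameter choices would be needed to reproduce the reported 80.0% accuracy.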

Citation (APA)

Mutasim, A. K., Tipu, R. S., Raihanul Bashar, M., & Ashraful Amin, M. (2017). Video Category Classification Using Wireless EEG. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 10654 LNAI, pp. 39–48). Springer Verlag. https://doi.org/10.1007/978-3-319-70772-3_4
