Decoding Behavior Tasks from Brain Activity Using Deep Transfer Learning

This article is free to access.
Abstract

Recent advances in noninvasive detection techniques have shown that it is possible to decode visual information from measurable brain activity. However, these studies have typically focused on the mapping between neural activity and visual information, such as an image or video stimulus, at the individual level. Here, we investigated common decoding models that classify behavior tasks from brain signals across individuals. We propose a cross-subject decoding approach using deep transfer learning (DTL) to decipher behavior tasks from functional magnetic resonance imaging (fMRI) recordings acquired while subjects performed different tasks. We connected parts of state-of-the-art networks pre-trained on the ImageNet dataset to our own adaptation layers to classify behavior tasks from fMRI data. Our experiments on the Human Connectome Project (HCP) dataset showed that the proposed method achieved higher cross-subject decoding accuracy than previous studies. We also conducted an experiment on five subsets of the HCP data, which further demonstrated that our DTL approach is more effective than traditional methods on small datasets.
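The core idea described in the abstract — reusing a frozen, pre-trained feature extractor and training only a small adaptation layer on top of it — can be sketched in a few lines. The following is a minimal, self-contained illustration using NumPy on synthetic data; it is not the authors' implementation. A fixed random projection stands in for the frozen ImageNet-pre-trained layers, and a softmax classifier stands in for the adaptation layers; all names and data here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the frozen, pre-trained backbone.
# In the paper this role is played by ImageNet-pre-trained CNN layers;
# here it is a fixed nonlinear projection whose weights are never updated.
W_frozen = rng.standard_normal((64, 16)) / np.sqrt(64)

def extract_features(x):
    """Frozen 'backbone': a fixed projection followed by a ReLU."""
    return np.maximum(x @ W_frozen, 0.0)

# Synthetic stand-in data: 200 samples, 64 "voxels", 4 behavior-task classes.
# Labels are made learnable from the frozen features so training can succeed.
n, d, k = 200, 64, 4
X = rng.standard_normal((n, d))
W_true = rng.standard_normal((16, k))
y = (extract_features(X) @ W_true).argmax(axis=1)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Trainable adaptation layer: a softmax classifier on the frozen features.
# Only W_adapt is updated; the backbone stays fixed, which is what makes
# this "transfer" rather than training from scratch.
feats = extract_features(X)
W_adapt = np.zeros((feats.shape[1], k))
onehot = np.eye(k)[y]
for _ in range(300):
    p = softmax(feats @ W_adapt)
    grad = feats.T @ (p - onehot) / n   # cross-entropy gradient
    W_adapt -= 0.1 * grad

acc = (softmax(feats @ W_adapt).argmax(axis=1) == y).mean()
print(f"training accuracy: {acc:.2f}")
```

Because only the small adaptation layer is fit, the number of trainable parameters stays low, which is consistent with the abstract's observation that the DTL approach works well on small datasets.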

Citation (APA)

Gao, Y., Zhang, Y., Wang, H., Guo, X., & Zhang, J. (2019). Decoding Behavior Tasks from Brain Activity Using Deep Transfer Learning. IEEE Access, 7, 43222–43232. https://doi.org/10.1109/ACCESS.2019.2907040
