Multimodal Autoencoder Predicts fNIRS Resting State From EEG Signals


Abstract

In this work, we introduce a deep learning architecture and evaluate it on multimodal electroencephalographic (EEG) and functional near-infrared spectroscopy (fNIRS) recordings from 40 epileptic patients. Long short-term memory units and convolutional neural networks are integrated within a multimodal sequence-to-sequence autoencoder. The trained network predicts fNIRS signals from EEG without a priori assumptions, hierarchically extracting deep features from the full EEG spectrum and from specific EEG frequency bands. Results show that higher-frequency EEG ranges are predictive of fNIRS signals, with gamma-band inputs dominating fNIRS prediction relative to the other frequency envelopes. Seed-based functional connectivity shows similar patterns between the experimental fNIRS and the model's fNIRS reconstructions. This is the first study to show that brain hemodynamics (fNIRS) can be predicted from encoded neural data (EEG) in the resting human epileptic brain, based on power-spectrum amplitude modulation of frequency oscillations and framed by specific hypotheses about how EEG frequency bands decode fNIRS signals.
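To make the architecture described above concrete, the sketch below shows one plausible way to combine a convolutional front end with an LSTM sequence-to-sequence core that maps EEG windows to fNIRS time series, written in PyTorch. The channel counts, layer sizes, and sequence lengths are illustrative assumptions, not the configuration published by the authors.

```python
# Hypothetical sketch of a CNN + LSTM sequence-to-sequence model that predicts
# fNIRS from EEG. All hyperparameters are assumed for illustration only.
import torch
import torch.nn as nn


class EEGtoFNIRS(nn.Module):
    """Extract local temporal EEG features with 1-D convolutions, compress them
    with an LSTM encoder, and decode an fNIRS time series with an LSTM decoder."""

    def __init__(self, eeg_channels=19, fnirs_channels=8, hidden=128):
        super().__init__()
        # Convolutional front end over the EEG time axis.
        self.cnn = nn.Sequential(
            nn.Conv1d(eeg_channels, 64, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.Conv1d(64, 64, kernel_size=7, padding=3),
            nn.ReLU(),
        )
        # Sequence-to-sequence core.
        self.encoder = nn.LSTM(64, hidden, batch_first=True)
        self.decoder = nn.LSTM(hidden, hidden, batch_first=True)
        self.readout = nn.Linear(hidden, fnirs_channels)

    def forward(self, eeg, fnirs_len):
        # eeg: (batch, eeg_channels, time)
        feats = self.cnn(eeg).transpose(1, 2)        # (batch, time, 64)
        _, (h, c) = self.encoder(feats)              # compress to a latent state
        # Repeat the final hidden state as decoder input for each output step.
        dec_in = h[-1].unsqueeze(1).repeat(1, fnirs_len, 1)
        dec_out, _ = self.decoder(dec_in, (h, c))
        return self.readout(dec_out)                 # (batch, fnirs_len, fnirs_channels)


if __name__ == "__main__":
    model = EEGtoFNIRS()
    eeg = torch.randn(4, 19, 1000)                   # 4 examples, 19 EEG channels
    pred = model(eeg, fnirs_len=100)                 # predict 100 fNIRS samples
    print(pred.shape)                                # torch.Size([4, 100, 8])
```

In this sketch the same model could be run separately on band-limited EEG envelopes (delta through gamma) to compare how well each frequency band predicts the fNIRS signal, which is the kind of band-wise comparison the abstract reports.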

Citation (APA)

Sirpal, P., Damseh, R., Peng, K., Nguyen, D. K., & Lesage, F. (2022). Multimodal Autoencoder Predicts fNIRS Resting State From EEG Signals. Neuroinformatics, 20(3), 537–558. https://doi.org/10.1007/s12021-021-09538-3
