Electroencephalography (EEG) signals are brain signals acquired non-invasively. Owing to their portability and practicality, EEG signals have found extensive application in monitoring human physiological states across various domains. In recent years, deep learning methods have been explored to decode the intricate information embedded in EEG signals. However, because EEG signals are acquired from human subjects, collecting the large amounts of data needed to train deep learning models is difficult. Previous research has therefore attempted to develop pre-trained models that yield significant performance improvements through fine-tuning when data are scarce. Nonetheless, existing pre-trained models often suffer from constraints, such as the need to operate on datasets with identical configurations or to distort the original data before the pre-trained model can be applied. In this paper, we propose the domain-free transformer, called DFformer, for generalizing EEG pre-trained models. In addition, we present a pre-trained model based on DFformer that integrates seamlessly across diverse datasets without requiring architectural modification or data distortion. The proposed model achieved competitive performance on motor imagery and sleep stage classification datasets. Notably, even when fine-tuned on datasets distinct from those used in pre-training, DFformer demonstrated marked performance improvements. These results demonstrate the potential of DFformer to overcome the conventional limitations of pre-trained model development, offering robust applicability across a spectrum of domains.
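The abstract does not describe DFformer's internals, so the sketch below is purely illustrative: it shows one way a single transformer backbone can accept EEG inputs with differing channel counts and recording lengths without architectural modification, by tokenizing fixed-length temporal patches from every channel so that dataset differences only change the token sequence length. All names and parameters (DFformerSketch, patch_len, and so on) are hypothetical and not taken from the paper; positional encoding is omitted for brevity.

```python
# Hypothetical sketch of a channel-count-agnostic transformer for EEG;
# not the paper's actual DFformer architecture.
import torch
import torch.nn as nn

class DFformerSketch(nn.Module):
    def __init__(self, patch_len=200, d_model=128, n_heads=8,
                 n_layers=4, n_classes=5):
        super().__init__()
        # Each fixed-length temporal patch from any channel becomes one
        # token, so varying channel counts or trial lengths change only
        # the sequence length, not the model.
        self.patch_len = patch_len
        self.embed = nn.Linear(patch_len, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, n_classes)  # swapped per downstream task

    def forward(self, x):
        # x: (batch, channels, time) with arbitrary channels/time
        b, c, t = x.shape
        t = (t // self.patch_len) * self.patch_len            # drop ragged tail
        patches = x[:, :, :t].reshape(b, -1, self.patch_len)  # (b, tokens, patch_len)
        z = self.encoder(self.embed(patches))
        return self.head(z.mean(dim=1))                       # pooled classification

# The same backbone accepts a 22-channel motor imagery trial and a
# 2-channel sleep epoch without any architectural change:
model = DFformerSketch()
print(model(torch.randn(4, 22, 1000)).shape)  # motor imagery-like input
print(model(torch.randn(4, 2, 3000)).shape)   # sleep-stage-like input
```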
Citation:
Kim, S. J., Lee, D. H., Kwak, H. G., & Lee, S. W. (2024). Toward Domain-Free Transformer for Generalized EEG Pre-Training. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 32, 482–492. https://doi.org/10.1109/TNSRE.2024.3355434