This work reviews the critical challenge of data scarcity in developing Transformer-based models for Electroencephalography (EEG)-based Brain-Computer Interfaces (BCIs), with a particular focus on Motor Imagery (MI) decoding. While EEG-BCIs hold immense promise for communication, rehabilitation, and human-computer interaction, limited data availability hinders the adoption of advanced deep-learning models such as Transformers. The paper analyzes three key strategies for addressing data scarcity: data augmentation, transfer learning, and the attention mechanisms inherent to Transformers. Data augmentation techniques artificially expand datasets, improving generalization by exposing models to a wider range of signal patterns. Transfer learning leverages models pre-trained on related domains, reusing their learned representations to compensate for small EEG datasets. Attention mechanisms, in turn, allow models to weight the most informative temporal and spatial features of the signal, making more efficient use of limited training data. Reviewing current research and methodologies, the survey examines the limitations imposed by small datasets, showcases emerging solutions, identifies research gaps, and suggests future directions to encourage further exploration and innovation in the field. Ultimately, it aims to contribute to more accessible, efficient, and accurate EEG-BCI systems by addressing the fundamental challenge of data scarcity.
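The survey itself stays at the review level and does not prescribe implementations. Purely as an illustration of signal-level data augmentation for MI-EEG, the minimal sketch below (hypothetical function and variable names; assumed 22-channel trials of 1000 samples) applies two perturbations commonly discussed in this literature, additive Gaussian noise and random circular time shifts:

import numpy as np

def augment_eeg(epochs, noise_std=0.1, max_shift=25, rng=None):
    """Return original plus augmented copies of EEG epochs.

    epochs    : array of shape (n_trials, n_channels, n_samples)
    noise_std : std of additive Gaussian noise, in signal units
    max_shift : maximum circular time shift, in samples
    """
    rng = np.random.default_rng(rng)
    # Copy 1: each trial perturbed with zero-mean Gaussian noise
    noisy = epochs + rng.normal(0.0, noise_std, size=epochs.shape)
    # Copy 2: each trial circularly shifted along the time axis
    shifts = rng.integers(-max_shift, max_shift + 1, size=len(epochs))
    shifted = np.stack(
        [np.roll(trial, s, axis=-1) for trial, s in zip(epochs, shifts)]
    )
    return np.concatenate([epochs, noisy, shifted], axis=0)

# Example: 100 MI trials, 22 channels, 1000 samples (4 s at 250 Hz, assumed)
trials = np.random.randn(100, 22, 1000)
augmented = augment_eeg(trials)
print(augmented.shape)  # (300, 22, 1000): effective training set tripled

Tripling the effective training set this way exposes a model to perturbed variants of every trial without new recording sessions, which is the generalization benefit the abstract describes.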
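Likewise, as an illustration of the transfer-learning pattern rather than the authors' specific method, a common recipe freezes a Transformer encoder pre-trained on source subjects and fine-tunes only a small classification head on scarce target-subject data. The PyTorch sketch below is a hypothetical example (all class names, dimensions, and the checkpoint path are assumptions):

import torch
import torch.nn as nn

class EEGTransformerClassifier(nn.Module):
    """Hypothetical Transformer encoder for MI-EEG classification."""
    def __init__(self, n_channels=22, d_model=64, n_classes=4):
        super().__init__()
        self.embed = nn.Linear(n_channels, d_model)   # per-timestep channel embedding
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(d_model, n_classes)     # task-specific classifier

    def forward(self, x):                 # x: (batch, time, channels)
        z = self.encoder(self.embed(x))   # (batch, time, d_model)
        return self.head(z.mean(dim=1))   # pool over time, then classify

model = EEGTransformerClassifier()
# model.load_state_dict(torch.load("pretrained_source_subjects.pt"))  # hypothetical checkpoint

# Freeze the pre-trained encoder; fine-tune only the classification head
for p in model.encoder.parameters():
    p.requires_grad = False
optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-4)

x = torch.randn(8, 1000, 22)  # batch of 8 trials: 1000 time steps, 22 channels
logits = model(x)             # (8, 4)

Freezing the encoder keeps the representations learned from abundant source data intact, so only a few thousand head parameters must be fit to the small target dataset.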
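For completeness, the attention mechanism that such models build on is standard scaled dot-product attention, which weights value vectors V by the softmax-normalized similarity between queries Q and keys K:

\[
\operatorname{Attention}(Q, K, V) = \operatorname{softmax}\!\left(\frac{Q K^{\top}}{\sqrt{d_k}}\right) V
\]

where d_k is the key dimension; in EEG decoding, Q, K, and V are typically derived from temporal or spatial embeddings of the signal, which is how attention concentrates model capacity on the most informative parts of a limited dataset.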
CITATION STYLE
Keutayeva, A., & Abibullaev, B. (2024). Data Constraints and Performance Optimization for Transformer-Based Models in EEG-Based Brain-Computer Interfaces: A Survey. IEEE Access, 12, 62628–62647. https://doi.org/10.1109/ACCESS.2024.3394696