Convolutional neural networks (CNNs) have gradually been applied to steady-state visual evoked potential (SSVEP) based brain-computer interfaces (BCIs), typically taking either frequency-domain features extracted by the fast Fourier transform (FFT) or raw time-domain signals as network input. In the frequency-domain representation, however, the features are weak at short time-windows and the phase information of each electrode channel may be discarded. We therefore propose a time-domain-based CNN method (tCNN) that uses the time-domain signal directly as network input, and further propose a filter bank tCNN (FB-tCNN) to improve its performance at short time-windows. We compare FB-tCNN with canonical correlation analysis (CCA) methods and other CNN methods on our own dataset and a public dataset, and FB-tCNN shows superior performance at short time-windows in the intra-individual test. With a 0.2 s time-window, the accuracy of our method reaches $88.36 \pm 4.89\%$ on our dataset and $77.78 \pm 2.16\%$ and $79.21 \pm 1.80\%$ in the two sessions of the public dataset, respectively, higher than the other methods. We also study the impact of the number of training subjects and the data length in the inter-individual (cross-individual) setting, and FB-tCNN shows potential for implementing inter-individual BCIs. Further analysis indicates that the deep learning method is easier to apply to an asynchronous BCI system than the training data-driven CCA. The code is available for reproducibility at https://github.com/DingWenl/FB-tCNN.
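
For illustration, the sketch below shows one way to realize a filter-bank front end feeding a small time-domain CNN, with one branch per sub-band. It is not the authors' implementation (see the repository above for that): the sampling rate, sub-band edges, filter order, channel count, class count, and the toy network `TinyTCNN` are assumed values chosen only to make the example self-contained and runnable.

```python
# Minimal filter-bank + time-domain CNN sketch, NOT the FB-tCNN reference code.
# All constants below (FS, BANDS, layer sizes) are illustrative assumptions.
import numpy as np
import torch
import torch.nn as nn
from scipy.signal import butter, filtfilt

FS = 250                                  # assumed EEG sampling rate (Hz)
BANDS = [(6, 50), (14, 50), (22, 50)]     # assumed sub-band edges (Hz)

def filter_bank(eeg, fs=FS, bands=BANDS, order=4):
    """Decompose a (channels, samples) EEG segment into band-passed copies.

    Returns an array of shape (n_bands, channels, samples).
    """
    sub_bands = []
    for low, high in bands:
        b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
        sub_bands.append(filtfilt(b, a, eeg, axis=-1))
    return np.stack(sub_bands, axis=0)

class TinyTCNN(nn.Module):
    """Toy time-domain CNN branch shared across sub-bands (illustrative only)."""
    def __init__(self, n_channels=9, n_classes=12, n_bands=len(BANDS)):
        super().__init__()
        self.branch = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=(n_channels, 1)),          # spatial filtering
            nn.Conv2d(16, 16, kernel_size=(1, 5), padding=(0, 2)),  # temporal filtering
            nn.BatchNorm2d(16),
            nn.ELU(),
            nn.AdaptiveAvgPool2d((1, 1)),
        )
        self.classifier = nn.Linear(16 * n_bands, n_classes)

    def forward(self, x):  # x: (batch, n_bands, channels, samples)
        feats = [self.branch(x[:, i:i + 1]).flatten(1) for i in range(x.shape[1])]
        return self.classifier(torch.cat(feats, dim=1))

# Usage: classify one 0.2 s segment (9 electrodes) at the assumed sampling rate.
segment = np.random.randn(9, int(0.2 * FS))
x = torch.tensor(filter_bank(segment)[None], dtype=torch.float32)
logits = TinyTCNN()(x)
print(logits.shape)  # -> torch.Size([1, 12])
```

Sharing one small branch across sub-bands and concatenating the per-band features reflects the general filter-bank strategy described in the abstract; the actual FB-tCNN layer configuration differs and is defined in the linked repository.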
Ding, W., Shan, J., Fang, B., Wang, C., Sun, F., & Li, X. (2021). Filter Bank Convolutional Neural Network for Short Time-Window Steady-State Visual Evoked Potential Classification. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 29, 2615–2624. https://doi.org/10.1109/TNSRE.2021.3132162