EEG-TCNTransformer: A Temporal Convolutional Transformer for Motor Imagery Brain-Computer Interfaces

Cited by: 0
Authors
Nguyen, Anh Hoang Phuc [1 ]
Oyefisayo, Oluwabunmi [1 ]
Pfeffer, Maximilian Achim [1 ]
Ling, Sai Ho [1 ]
Affiliations
[1] Univ Technol Sydney, Fac Engn & Informat Technol, Ultimo, NSW 2007, Australia
Source
SIGNALS | 2024, Vol. 5, Issue 3
Keywords
brain-computer interface; motor imagery; electroencephalography; convolutional neural network; transformer; self-attention; bandpass filter; TIME
DOI
10.3390/signals5030034
Chinese Library Classification (CLC)
TM (Electrical Engineering); TN (Electronics & Communication Technology)
Discipline Codes
0808; 0809
Abstract
In brain-computer interface motor imagery (BCI-MI) systems, convolutional neural networks (CNNs) have traditionally dominated as the deep learning method of choice, demonstrating significant advancements in state-of-the-art studies. Recently, Transformer models with attention mechanisms have emerged as a sophisticated technique, enhancing the capture of long-term dependencies and intricate feature relationships in BCI-MI. This research investigates the performance of EEG-TCNet and EEG-Conformer models, which are trained and validated using various hyperparameters and bandpass filters during preprocessing to assess improvements in model accuracy. Additionally, this study introduces EEG-TCNTransformer, a novel model that integrates the convolutional architecture of EEG-TCNet with a series of self-attention blocks employing a multi-head structure. EEG-TCNTransformer achieves an accuracy of 83.41% without the application of bandpass filtering.
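The abstract describes appending a series of multi-head self-attention blocks to EEG-TCNet's convolutional feature extractor. As a rough illustration of the attention mechanism only (not the authors' implementation; the sequence length, model width, head count, and random weights below are all hypothetical), a minimal NumPy sketch of one multi-head self-attention block applied to a temporal feature sequence:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_self_attention(x, w_q, w_k, w_v, w_o, n_heads):
    """Multi-head self-attention over a (seq_len, d_model) feature sequence,
    e.g. temporal features produced by a convolutional front end."""
    seq_len, d_model = x.shape
    d_head = d_model // n_heads
    # Project to queries/keys/values and split into heads: (heads, seq, d_head).
    q = (x @ w_q).reshape(seq_len, n_heads, d_head).transpose(1, 0, 2)
    k = (x @ w_k).reshape(seq_len, n_heads, d_head).transpose(1, 0, 2)
    v = (x @ w_v).reshape(seq_len, n_heads, d_head).transpose(1, 0, 2)
    # Scaled dot-product attention per head: (heads, seq, seq).
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)
    out = softmax(scores) @ v                     # (heads, seq, d_head)
    # Merge heads back and apply the output projection.
    out = out.transpose(1, 0, 2).reshape(seq_len, d_model)
    return out @ w_o

# Hypothetical sizes: 16 time steps of 32-dimensional conv features, 4 heads.
rng = np.random.default_rng(0)
seq_len, d_model, n_heads = 16, 32, 4
x = rng.standard_normal((seq_len, d_model))
w_q, w_k, w_v, w_o = (rng.standard_normal((d_model, d_model)) * 0.1
                      for _ in range(4))
y = multi_head_self_attention(x, w_q, w_k, w_v, w_o, n_heads)
print(y.shape)  # (16, 32)
```

The key property this sketch shows is that attention scores are computed between every pair of time steps, which is how such blocks capture the long-term dependencies the abstract refers to; a CNN alone mixes information only within its receptive field.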
Pages: 605-632 (28 pages)