Riding feeling recognition based on multi-head self-attention LSTM for driverless automobile

Cited by: 1
Authors
Tang, Xianzhi [1 ]
Xie, Yongjia [1 ]
Li, Xinlong [1 ]
Wang, Bo [1 ]
Affiliations
[1] Yanshan Univ, Sch Vehicles & Energy, Hebei Key Lab Special Carrier Equipment, Hebei St, Qinhuangdao 066004, Hebei, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Electroencephalography (EEG); Attention; Feature extraction; Driving experience;
DOI
10.1016/j.patcog.2024.111135
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
With the emergence of driverless technology, passenger ride comfort has become a growing concern. In recent years, EEG-based driving-fatigue detection and braking-sensation evaluation have received increasing attention, and analyzing ride comfort from EEG signals is likewise a direct and intuitive approach. However, finding an effective method or model to evaluate passenger comfort remains a challenge. In this paper, we propose a long short-term memory (LSTM) network model based on a multi-head self-attention mechanism for passenger comfort detection. Applying the multi-head attention mechanism during feature extraction yields more effective classification results. The results show that the LSTM network with multi-head self-attention supports efficient decision making along with higher classification accuracy. In conclusion, the classifier based on the multi-head attention mechanism proposed in this paper performs strongly in classifying EEG signals across different emotional states and has broad prospects in brain-computer interaction.
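The abstract describes applying multi-head self-attention during feature extraction, ahead of the LSTM classifier. As a rough illustration only (this is not the authors' implementation; the shapes, random weights, and NumPy formulation are all assumptions), the core scaled dot-product multi-head attention step over a sequence of per-window EEG feature vectors can be sketched as:

```python
import numpy as np

def multi_head_self_attention(x, wq, wk, wv, wo, num_heads):
    """Scaled dot-product multi-head self-attention over a feature sequence.

    x  : (T, d_model) sequence of per-window EEG feature vectors.
    wq, wk, wv, wo : (d_model, d_model) learned projection matrices
                     (random here, purely for illustration).
    """
    T, d_model = x.shape
    d_head = d_model // num_heads

    def project(w):  # (T, d_model) -> (num_heads, T, d_head)
        return (x @ w).reshape(T, num_heads, d_head).transpose(1, 0, 2)

    q, k, v = project(wq), project(wk), project(wv)
    # Attention scores per head, scaled by sqrt(d_head): (h, T, T)
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)
    # Numerically stable softmax over the key axis
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    heads = weights @ v                                   # (h, T, d_head)
    # Concatenate heads and apply the output projection
    out = heads.transpose(1, 0, 2).reshape(T, d_model) @ wo
    return out, weights

rng = np.random.default_rng(0)
T, d_model, h = 8, 16, 4                  # 8 time windows, 16-dim features
x = rng.normal(size=(T, d_model))         # stand-in for extracted EEG features
proj = [rng.normal(size=(d_model, d_model)) * 0.1 for _ in range(4)]
out, attn = multi_head_self_attention(x, *proj, num_heads=h)
print(out.shape)  # (8, 16): an attended sequence, which an LSTM could then consume
```

The attended output sequence keeps the input's shape, so it can be fed directly into an LSTM layer; each head's attention weights sum to one over the keys for every query position.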
Pages: 12