Gait Recognition in Different Terrains with IMUs Based on Attention Mechanism Feature Fusion Method

Citations: 0
Authors
Mengxue Yan
Ming Guo
Jianqiang Sun
Jianlong Qiu
Xiangyong Chen
Affiliations
[1] Linyi University,School of Automation and Electrical Engineering
Source
Neural Processing Letters | 2023, Vol. 55
Keywords
Gait recognition; Inertial measurement unit; Lightweight convolutional neural network; Attention mechanism; Feature fusion;
DOI: not available
Abstract
Gait recognition, which studies the characteristics of human gait across different terrains, is significant for disease diagnosis and rehabilitation training. To address the problem that transitions between outdoor terrains alter a walker's gait, a gait recognition algorithm based on feature fusion with an attention mechanism is proposed. First, acceleration, angular velocity, and angle signals collected by an inertial measurement unit are used; the acquired inertial gait data are then segmented into gait cycles to obtain the period data of each step; features are extracted from these data, and the one-dimensional signals are visualized as two-dimensional images. A lightweight model combining a convolutional neural network with an attention mechanism is designed, and a new attention-mechanism-based feature fusion method is proposed for extracting features from multiple sensors and fusing them for gait recognition. Comparative experiments show that the proposed model reaches a recognition accuracy of 89% and recognizes gait well across different terrains.
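The fusion step described in the abstract can be sketched in miniature: score each sensor's per-cycle feature vector, softmax the scores into attention weights, and take the weighted sum as the fused representation. This is a minimal illustration, not the paper's implementation: all names are hypothetical, simple statistics stand in for the CNN features, and the scoring vector is randomly initialized rather than learned.

```python
import numpy as np

def cycle_features(signal):
    """Hand-crafted features for one gait cycle of one sensor stream
    (mean, std, min, max) as stand-ins for learned CNN features."""
    return np.array([signal.mean(), signal.std(), signal.min(), signal.max()])

def attention_fusion(feats, score_w):
    """Attention-based fusion: score each sensor's feature vector,
    softmax the scores into weights, return the weighted sum."""
    scores = feats @ score_w                       # one score per sensor
    scores = scores - scores.max()                 # numerical stability
    alpha = np.exp(scores) / np.exp(scores).sum()  # attention weights, sum to 1
    return alpha @ feats, alpha                    # fused vector, weights

rng = np.random.default_rng(42)
# Simulated single gait cycle from three IMU streams:
# acceleration, angular velocity, and angle (100 samples each).
streams = {name: rng.normal(size=100) for name in ("accel", "gyro", "angle")}
feats = np.stack([cycle_features(s) for s in streams.values()])  # (3, 4)
score_w = rng.normal(size=feats.shape[1])  # hypothetical scoring vector
fused, alpha = attention_fusion(feats, score_w)
print(fused.shape, alpha)
```

In the paper's setting the scoring parameters would be trained end-to-end with the classifier, so sensors whose features discriminate terrains better receive larger weights.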
Pages: 10215-10234 (19 pages)
Related Papers
50 items total
  • [31] A Semantic Segmentation Method of Remote Sensing Image Based on Feature Fusion and Attention Mechanism
    Wang, Yiqin
    Dong, Yunyun
    JOURNAL OF INFORMATION PROCESSING SYSTEMS, 2024, 20 (05): : 640 - 653
  • [32] A method of knowledge distillation based on feature fusion and attention mechanism for complex traffic scenes
    Li, Cui-jin
    Qu, Zhong
    Wang, Sheng-ye
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2023, 124
  • [33] A Gait Recognition Method Based on Features Fusion and SVM
    Ni, Jian
    Liang, Li-bo
    PROCEEDINGS OF THE 2009 SECOND PACIFIC-ASIA CONFERENCE ON WEB MINING AND WEB-BASED APPLICATION, 2009, : 43 - 46
  • [34] A comprehensive study on codebook-based feature fusion for gait recognition
    Khan, Muhammad Hassan
    Farid, Muhammad Shahid
    Grzegorzek, Marcin
    INFORMATION FUSION, 2023, 92 : 216 - 230
  • [35] AMFF: A new attention-based multi-feature fusion method for intention recognition
    Liu, Cong
    Xu, Xiaolong
    KNOWLEDGE-BASED SYSTEMS, 2021, 233
  • [36] Lightweight facial expression recognition method based on attention mechanism and key region fusion
    Kong, Yinghui
    Ren, Zhaohan
    Zhang, Ke
    Zhang, Shuaitong
    Ni, Qiang
    Han, Jungong
    JOURNAL OF ELECTRONIC IMAGING, 2021, 30 (06)
  • [37] Gait recognition via weighted global-local feature fusion and attention-based multiscale temporal aggregation
    Xu, Yingqi
    Xi, Hao
    Ren, Kai
    Zhu, Qiyuan
    Hu, Chuanping
    JOURNAL OF ELECTRONIC IMAGING, 2025, 34 (01)
  • [38] Object Detection Network Based on Feature Fusion and Attention Mechanism
    Zhang, Ying
    Chen, Yimin
    Huang, Chen
    Gao, Mingke
    FUTURE INTERNET, 2019, 11 (01):
  • [39] An Improved YOLOv5 Model Based on Feature Fusion and Attention Mechanism for Multiscale Satellite Recognition
    Shen, Naijun
    Xv, Rui
    Gao, Yang
    Qian, Chen
    Chen, Qingwei
    IEEE SENSORS JOURNAL, 2024, 24 (12) : 19385 - 19396
  • [40] Facial expression recognition based on attention mechanism and feature correlation
    Lan L.
    Liu Q.
    Lu S.
    Beijing Hangkong Hangtian Daxue Xuebao/Journal of Beijing University of Aeronautics and Astronautics, 2021, 48 (01): : 147 - 155