On the Benefit of FMG and EMG Sensor Fusion for Gesture Recognition Using Cross-Subject Validation

Cited by: 0
Authors
Rohr, Maurice [1 ]
Haidamous, Jad [1 ]
Schaefer, Niklas [2 ]
Schaumann, Stephan [2 ]
Latsch, Bastian [2 ]
Kupnik, Mario [2 ]
Antink, Christoph Hoog [1 ]
Affiliations
[1] Tech Univ Darmstadt, AI Syst Med Lab, D-64283 Darmstadt, Germany
[2] Tech Univ Darmstadt, Measurement & Sensor Technol Grp, D-64283 Darmstadt, Germany
Keywords
Electromyography (EMG); Muscles; Gesture recognition; Force; Hands; Accuracy; Sensor fusion; Data acquisition; Cameras; Feature extraction; Ferroelectrets; Force myography (FMG); Surface electromyography; Classification; Piezoelectrets
DOI
10.1109/TNSRE.2025.3543649
CLC Number
R318 [Biomedical Engineering]
Subject Classification Code
0831
Abstract
Hand gestures are a natural form of human communication, making gesture recognition a sensible approach for intuitive human-computer interaction. Wearable sensors on the forearm can be used to detect the muscle contractions that generate these gestures, but classification approaches relying on a single measured modality lack accuracy and robustness. In this work, we analyze sensor fusion of force myography (FMG) and electromyography (EMG) for gesture recognition. We employ piezoelectric FMG sensors based on ferroelectrets and a commercial EMG system in a user study with 13 participants to measure 66 distinct hand movements with 10 ms labelling precision. Three classification tasks, namely flexion and extension, single finger, and all finger movement classification, are performed using common handcrafted features as input to machine learning classifiers. Subsequently, the evaluation covers the effectiveness of the sensor fusion through correlation analysis, classification performance assessed with leave-one-subject-out cross-validation and 5x2cv t-tests, and the effects of involuntary movements on classification. We find that sensor fusion leads to a significant improvement (42% higher average recognition accuracy) on all three tasks and that both sensor modalities contain complementary information. Furthermore, we confirm this finding using reduced FMG and EMG sensor sets. This study reinforces the results of prior research on the effectiveness of sensor fusion by performing meticulous statistical analyses, thereby paving the way for multi-sensor gesture recognition in assistance systems.
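To make the evaluation pipeline described in the abstract concrete, the following is a minimal, hedged sketch of feature-level FMG/EMG fusion with leave-one-subject-out cross-validation using scikit-learn. The synthetic data, channel counts, handcrafted features (RMS, mean absolute value, waveform length for EMG; simple amplitude statistics for FMG), and the random-forest classifier are illustrative assumptions, not the authors' exact pipeline.

```python
# Hedged sketch (not the authors' code): feature-level fusion of FMG and EMG
# windows evaluated with leave-one-subject-out cross-validation.
# Data shapes, feature choices, and the classifier are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(0)
n_subjects, wins_per_subject, win_len = 13, 40, 200   # 13 participants as in the study
n_emg_ch, n_fmg_ch = 8, 8                             # assumed channel counts

def emg_features(w):                                   # w: (channels, samples)
    rms = np.sqrt(np.mean(w**2, axis=1))               # root mean square
    mav = np.mean(np.abs(w), axis=1)                   # mean absolute value
    wl = np.sum(np.abs(np.diff(w, axis=1)), axis=1)    # waveform length
    return np.concatenate([rms, mav, wl])

def fmg_features(w):                                   # simple amplitude statistics
    return np.concatenate([w.mean(axis=1), w.std(axis=1), w.max(axis=1) - w.min(axis=1)])

# Build a synthetic, labelled dataset with one group id per subject.
X_emg, X_fmg, y, groups = [], [], [], []
for subj in range(n_subjects):
    for _ in range(wins_per_subject):
        label = rng.integers(0, 3)                      # e.g. three gesture classes
        emg = rng.normal(scale=1.0 + label, size=(n_emg_ch, win_len))
        fmg = rng.normal(loc=label, scale=0.5, size=(n_fmg_ch, win_len))
        X_emg.append(emg_features(emg)); X_fmg.append(fmg_features(fmg))
        y.append(label); groups.append(subj)

X_emg, X_fmg = np.array(X_emg), np.array(X_fmg)
X_fused = np.hstack([X_emg, X_fmg])                     # feature-level sensor fusion
y, groups = np.array(y), np.array(groups)

logo = LeaveOneGroupOut()                               # leave-one-subject-out CV
clf = RandomForestClassifier(n_estimators=200, random_state=0)
for name, X in [("EMG only", X_emg), ("FMG only", X_fmg), ("EMG+FMG fused", X_fused)]:
    acc = cross_val_score(clf, X, y, cv=logo, groups=groups).mean()
    print(f"{name}: mean LOSO accuracy = {acc:.2f}")
```

In this scheme, fusion amounts to concatenating the per-modality feature vectors before classification; the paper's actual feature set, classifiers, and the 5x2cv t-test procedure for significance testing may differ.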
Pages: 935-944
Page count: 10
Related Papers
50 records in total
  • [1] Cross-subject EMG hand gesture recognition based on dynamic domain generalization
    Ye, Yalan
    He, Yujie
    Pan, Tongjie
    Dong, Qiaosen
    Yuan, Jiajun
    Zhou, Wengang
    2023 45TH ANNUAL INTERNATIONAL CONFERENCE OF THE IEEE ENGINEERING IN MEDICINE & BIOLOGY SOCIETY, EMBC, 2023,
  • [2] EMG Subspace Alignment and Visualization for Cross-Subject Hand Gesture Classification
    Colot, Martin
    Simar, Cedric
    Petieau, Mathieu
    Alvarez, Ana Maria Cebolla
    Cheron, Guy
    Bontempi, Gianluca
    MACHINE LEARNING AND PRINCIPLES AND PRACTICE OF KNOWLEDGE DISCOVERY IN DATABASES, ECML PKDD 2023, PT IV, 2025, 2136 : 416 - 423
  • [3] An extended variational autoencoder for cross-subject electromyograph gesture recognition
    Zhang, Zhen
    Ming, Yuewei
    Shen, Quming
    Wang, Yanyu
    Zhang, Yuhui
    BIOMEDICAL SIGNAL PROCESSING AND CONTROL, 2025, 99
  • [4] Transfer Learning Enhanced Cross-Subject Hand Gesture Recognition with sEMG
    Zhang, Shenyilang
    Fang, Yinfeng
    Wan, Jiacheng
    Jiang, Guozhang
    Li, Gongfa
    JOURNAL OF MEDICAL AND BIOLOGICAL ENGINEERING, 2023, 43 (06) : 672 - 688
  • [5] Cross-Subject Multimodal Emotion Recognition Based on Hybrid Fusion
    Cimtay, Yucel
    Ekmekcioglu, Erhan
    Caglar-Ozhan, Seyma
    IEEE ACCESS, 2020, 8 : 168865 - 168878
  • [6] MASS: A Multisource Domain Adaptation Network for Cross-Subject Touch Gesture Recognition
    Li, Yun-Kai
    Meng, Qing-Hao
    Wang, Ya-Xin
    Yang, Tian-Hao
    Hou, Hui-Rang
    IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS, 2023, 19 (03) : 3099 - 3108
  • [7] EMG-Based Cross-Subject Silent Speech Recognition Using Conditional Domain Adversarial Network
    Zhang, Yakun
    Cai, Huihui
    Wu, Jinghan
    Xie, Liang
    Xu, Minpeng
    Ming, Dong
    Yan, Ye
    Yin, Erwei
    IEEE TRANSACTIONS ON COGNITIVE AND DEVELOPMENTAL SYSTEMS, 2023, 15 (04) : 2282 - 2290
  • [8] Manifold Feature Fusion with Dynamical Feature Selection for Cross-Subject Emotion Recognition
    Hua, Yue
    Zhong, Xiaolong
    Zhang, Bingxue
    Yin, Zhong
    Zhang, Jianhua
    BRAIN SCIENCES, 2021, 11 (11)
  • [9] Cross-Subject Emotion Recognition Using Deep Adaptation Networks
    Li, He
    Jin, Yi-Ming
    Zheng, Wei-Long
    Lu, Bao-Liang
    NEURAL INFORMATION PROCESSING (ICONIP 2018), PT V, 2018, 11305 : 403 - 413