Surface Electromyography Signal Recognition Based on Deep Learning for Human-Robot Interaction and Collaboration

Cited by: 23
Author
Mendes, Nuno [1 ]
Affiliation
[1] NOVA Univ Lisbon, Res & Dev Unit Mech & Ind Engn, Campus Caparica, P-2829516 Lisbon, Portugal
Keywords
Pattern recognition; Data segmentation; Deep learning; Surface electromyography; Robotics; Industry 4.0; Unsupervised gesture segmentation; Hand; Classification; Scheme
DOI
10.1007/s10846-022-01666-5
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
The interaction between humans and collaborative robots in performing shared tasks has drawn the interest of researchers and industry toward the development of gesture recognition systems. Surface electromyography (sEMG) devices are well suited to capturing human hand gestures. However, this technology raises significant challenges: sEMG signals are difficult to acquire and isolate reliably, and building a representative gesture model is hard because sEMG signals are not explicit. Several solutions have been proposed for sEMG-based hand gesture recognition, but none is entirely satisfactory. This study takes a step toward solving this problem. A prototype sEMG acquisition device was used to collect human hand gestures, and a two-step algorithm is proposed to recognize five valid gestures, invalid gestures, and non-gestures. The first step (segmentation) isolates the sEMG signal, separating signals containing gestures from signals containing non-gestures. The second step (recognition) is based on a deep learning method, a convolutional neural network (CNN), which identifies which gesture is present in the sEMG signal. The performance of the prototype device and recognition architecture was successfully compared with that of the off-the-shelf sEMG device Myo. Results indicated that the segmentation step, by excluding sEMG signals containing non-gestures, played an important role in the success of the gesture recognition system. The proposed system was successfully applied in the control loop of a collaborative robotic application, where the gesture recognition system achieved an online class recognition rate (CR) of 98%, outperforming similar studies in the literature.
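As a concrete illustration of the two-step pipeline described in the abstract, the following Python sketch pairs a simple amplitude-threshold segmenter with a small 1-D CNN classifier. The window length, activity threshold, channel count, and network layout are illustrative assumptions; the paper's exact segmentation rule and CNN architecture are not reproduced here.

# Hedged sketch of the abstract's two-step pipeline:
# (1) threshold-based segmentation to separate gesture from non-gesture
#     sEMG windows, (2) a small CNN that classifies each segmented window.
# All numeric choices below are placeholders, not the paper's values.
import numpy as np
import torch
import torch.nn as nn

def segment_gesture_windows(emg, window=200, threshold=0.05):
    """Return non-overlapping windows whose mean rectified amplitude
    exceeds a threshold. emg has shape (n_samples, n_channels); the
    threshold is an assumed activity level that would in practice be
    calibrated per user/session."""
    windows = []
    for start in range(0, emg.shape[0] - window + 1, window):
        w = emg[start:start + window]
        if np.abs(w).mean() > threshold:   # crude gesture/non-gesture test
            windows.append(w)
    return windows

class GestureCNN(nn.Module):
    """Minimal 1-D CNN over (channels, time) sEMG windows."""
    def __init__(self, n_channels=8, n_classes=6):   # 5 gestures + invalid
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x):                  # x: (batch, channels, time)
        return self.classifier(self.features(x).squeeze(-1))

if __name__ == "__main__":
    # Synthetic stand-in for an sEMG recording: 2000 samples, 8 channels.
    emg = (np.random.randn(2000, 8) * 0.1).astype(np.float32)
    model = GestureCNN()
    for w in segment_gesture_windows(emg):
        logits = model(torch.tensor(w.T).unsqueeze(0))   # (1, 8, 200)
        print("predicted class:", int(logits.argmax(dim=1)))

In a robot control loop, only windows that pass the segmentation test would reach the CNN, which mirrors the abstract's finding that excluding non-gesture signals is key to the system's online recognition rate.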
Pages: 21