A Dynamic Head Gesture Recognition Method for Real-time Intention Inference and Its Application to Visual Human-robot Interaction

Cited by: 1
Authors
Xie, Jialong [1 ]
Zhang, Botao [1 ]
Lu, Qiang [1 ]
Borisov, Oleg [2 ]
Affiliations
[1] Hangzhou Dianzi Univ, Sch Automat, Hangzhou, Peoples R China
[2] ITMO Univ, Fac Control Syst & Robot, St Petersburg, Russia
Funding
National Natural Science Foundation of China;
关键词
Computer vision; deep learning; head gesture; human-robot interaction; motion;
DOI
10.1007/s12555-022-0051-6
Chinese Library Classification
TP [Automation and computer technology];
Discipline classification code
0812 ;
Abstract
Head gestures are a natural, non-verbal communication channel for human-computer and human-robot interaction, conveying attitudes and intentions. However, existing vision-based recognition methods cannot meet the precision and robustness requirements of such interaction. Owing to limited computational resources, deploying most high-accuracy methods on mobile and onboard devices is challenging, while wearable device-based approaches are inconvenient and expensive. To address these problems, an end-to-end two-stream fusion network named TSIR3D is proposed to identify head gestures from videos and thereby analyze human attitudes and intentions. Inspired by the Inception and ResNet architectures, the width and depth of the network are increased to capture motion features sufficiently, and the convolutional kernels are expanded from the spatial domain to the spatiotemporal domain for temporal feature extraction. The fusion position of the two-stream channels is explored under an accuracy/complexity trade-off. Furthermore, a dynamic head gesture dataset named DHG and a behavior tree are designed for human-robot interaction. Experimental results show that the proposed method achieves real-time performance both on a remote server and on an onboard computer. Its accuracy on DHG surpasses most state-of-the-art vision-based methods and even most previous approaches based on head-mounted sensors. Finally, TSIR3D is deployed on a Pepper robot equipped with a Jetson TX2.
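The abstract's expansion of convolutional kernels "from the spatial domain to the spatiotemporal domain" can be illustrated with a minimal NumPy sketch of the well-known I3D-style kernel inflation. This is an assumption for illustration only, not the authors' actual TSIR3D code; `inflate_kernel` is a hypothetical helper:

```python
import numpy as np

def inflate_kernel(k2d, t):
    """Inflate a 2D spatial kernel of shape (H, W) into a 3D
    spatiotemporal kernel of shape (T, H, W) by replicating it t
    times along the temporal axis and rescaling by 1/t, so that a
    temporally constant input produces the same response as the
    original 2D convolution (I3D-style inflation)."""
    return np.repeat(k2d[np.newaxis, :, :], t, axis=0) / t

# 3x3 spatial averaging kernel, inflated to 3x3x3
k2d = np.ones((3, 3)) / 9.0
k3d = inflate_kernel(k2d, t=3)
# The inflated kernel still sums to 1, so a static video patch
# yields the same output as the 2D convolution would.
print(k3d.shape, round(float(k3d.sum()), 6))  # → (3, 3, 3) 1.0
```

In practice such inflated 3D kernels let a network pretrained on still images be extended to video, which matches the abstract's goal of capturing motion features over time.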
Pages: 252-264
Number of pages: 13
Related papers
50 records in total
  • [41] Head and Eye Egocentric Gesture Recognition for Human-Robot Interaction Using Eyewear Cameras
    Marina-Miranda, Javier
    Javier Traver, V
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2022, 7 (03) : 7067 - 7074
  • [42] When my robot smiles at me: Enabling human-robot rapport via real-time head gesture mimicry
    Laurel D. Riek
    Philip C. Paul
    Peter Robinson
    Journal on Multimodal User Interfaces, 2010, 3 : 99 - 108
  • [44] Real-Time Recognition of Extroversion-Introversion Trait in Context of Human-Robot Interaction
    Zafar, Zuhair
    Paplu, Sarwar Hussain
    Berns, Karsten
    ADVANCES IN SERVICE AND INDUSTRIAL ROBOTICS, RAAD 2018, 2019, 67 : 63 - 70
  • [45] A Robust Myoelectric Gesture Recognition Method for Enhancing the Reliability of Human-Robot Interaction
    Wang, Long
    Chen, Zhangyi
    Zhou, Shanjun
    Yu, Yilin
    Li, Xiaoling
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2025, 10 (04): : 3731 - 3738
  • [46] A Social Robot Architecture for Personalized Real-Time Human-Robot Interaction
    Foggia, Pasquale
    Greco, Antonio
    Roberto, Antonio
    Saggese, Alessia
    Vento, Mario
    IEEE INTERNET OF THINGS JOURNAL, 2023, 10 (24): : 22427 - 22439
  • [47] Compact Real-time Avoidance on a Humanoid Robot for Human-robot Interaction
    Dong Hai Phuong Nguyen
    Hoffmann, Matej
    Roncone, Alessandro
    Pattacini, Ugo
    Metta, Giorgio
    HRI '18: PROCEEDINGS OF THE 2018 ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTION, 2018, : 416 - 424
  • [48] A Method of Intention Estimation for Human-Robot Interaction
    Luo, Jing
    Liu, Chao
    Wang, Ning
    Yang, Chenguang
    ADVANCES IN COMPUTATIONAL INTELLIGENCE SYSTEMS (UKCI 2019), 2020, 1043 : 69 - 80
  • [49] Dynamic Gesture Recognition for Human Robot Interaction
    Lee-Ferng, Jong
    Ruiz-del-Solar, Javier
    Verschae, Rodrigo
    Correa, Mauricio
    2009 6TH LATIN AMERICAN ROBOTICS SYMPOSIUM, 2009, : 57 - 64
  • [50] Human-robot interaction in Industry 4.0 based on an Internet of Things real-time gesture control system
    Roda-Sanchez, Luis
    Olivares, Teresa
    Garrido-Hidalgo, Celia
    Luis de la Vara, Jose
    Fernandez-Caballero, Antonio
    INTEGRATED COMPUTER-AIDED ENGINEERING, 2021, 28 (02) : 159 - 175