Amaging: Acoustic Hand Imaging for Self-adaptive Gesture Recognition

Cited by: 25
Authors
Wang, Penghao [1]
Jiang, Ruobing [1]
Liu, Chao [1]
Affiliation
[1] Ocean Univ China, Dept Comp Sci & Technol, Qingdao, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
acoustic hand imaging; ubiquitous gesture recognition; mobile interference filtering; adaptive gesture response
DOI
10.1109/INFOCOM48880.2022.9796906
Chinese Library Classification
TP3 [computing technology, computer technology]
Discipline Code
0812
Abstract
A practical challenge common to state-of-the-art acoustic gesture recognition techniques is to respond adaptively to intended gestures rather than unintended motions during real-time tracking of the human motion flow. In addition, an under-expanded sensing space and vulnerability to mobile interference jointly impair the pervasiveness of acoustic sensing. Instead of struggling along this bottlenecked routine, we open up an independent sensing dimension: acoustic 2-D hand-shape imaging. We first deductively demonstrate the feasibility of acoustic imaging through multiple viewpoints dynamically generated by hand movement. We then propose Amaging, a hand-shape-imaging-triggered gesture recognition system that offers adaptive gesture responses. Digital Dechirp is performed to greatly reduce the computational cost of demodulation and pulse compression. Mobile interference is filtered out by Moving Target Indication. Multi-frame macro-scale imaging with Joint Time-Frequency Analysis eliminates image blur while maintaining adequate resolution. Amaging features a multiplicative expansion of sensing capability and dual-dimensional parallelism for both hand-shape and gesture-trajectory recognition. Extensive experiments and simulations demonstrate Amaging's distinctive hand-shape imaging performance, independent of diverse hand movements and immune to mobile interference. A 96% hand-shape recognition rate is achieved with ResNet18 and a 60x data augmentation rate.
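Two of the signal-processing steps named in the abstract, dechirp-based pulse compression and Moving Target Indication (MTI), can be illustrated with a minimal NumPy simulation. This is a sketch under assumed parameters (the 48 kHz sample rate, 17-21 kHz chirp, and echo delays are illustrative choices, not values from the paper):

```python
import numpy as np

# Sketch of two steps named in the abstract (all parameter values are
# assumptions for this demo, not taken from the paper):
#   1) Digital dechirp: multiply the received echo by the conjugate of the
#      transmitted chirp so each echo delay becomes a constant beat
#      frequency; an FFT then performs pulse compression into a range profile.
#   2) Moving Target Indication (MTI): subtract consecutive frames so static
#      reflectors cancel and only moving targets (the hand) remain.

fs = 48_000                          # sample rate (Hz), assumed
f0, B, T = 17_000.0, 4_000.0, 0.01   # chirp start freq, bandwidth, duration
k = B / T                            # chirp rate (Hz/s)
t = np.arange(int(fs * T)) / fs      # one frame of samples (480 samples)

def chirp(delay_s, amp=1.0):
    """A transmitted chirp delayed by delay_s seconds (zero before arrival)."""
    td = t - delay_s
    s = np.zeros_like(t, dtype=complex)
    m = td >= 0
    s[m] = amp * np.exp(1j * 2 * np.pi * (f0 * td[m] + 0.5 * k * td[m] ** 2))
    return s

ref = chirp(0.0)  # transmitted reference chirp

def dechirp(rx):
    """Digital dechirp + FFT: a delay tau maps to a tone at k*tau Hz."""
    beat = np.conj(rx) * ref         # delay tau -> beat tone at +k*tau Hz
    return np.abs(np.fft.fft(beat))  # magnitude range profile (100 Hz/bin)

# Two frames: a static wall echo plus a hand echo that moves between frames.
wall = chirp(1.0e-3)                        # wall: 1.0 ms -> 400 Hz -> bin 4
frame1_rx = wall + chirp(2.0e-3, amp=0.5)   # hand: 2.0 ms -> 800 Hz -> bin 8
frame2_rx = wall + chirp(2.5e-3, amp=0.5)   # hand: 2.5 ms -> 1 kHz -> bin 10

profile = dechirp(frame1_rx)                  # static wall dominates
mti_profile = dechirp(frame2_rx - frame1_rx)  # wall cancels exactly

print("raw peak bin:", int(np.argmax(profile)))      # the wall's range bin
print("MTI peak bin:", int(np.argmax(mti_profile)))  # the hand's range bin
```

In the raw range profile the strong static reflector wins; after frame-to-frame subtraction only the moving hand echo survives, which is the interference-filtering role the abstract assigns to MTI.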
Pages: 80-89 (10 pages)