SHAPE: a dataset for hand gesture recognition

Cited by: 0
Authors
Dang, Tuan Linh [1 ]
Nguyen, Huu Thang [2 ]
Dao, Duc Manh [1 ]
Nguyen, Hoang Vu [1 ]
Luong, Duc Long [1 ]
Nguyen, Ba Tuan [1 ]
Kim, Suntae [3 ]
Monet, Nicolas [3 ]
Affiliations
[1] School of Information and Communications Technology, Hanoi University of Science and Technology, 01 Dai Co Viet Street, Hanoi 100000, Viet Nam
[2] School of Electrical Engineering, Hanoi University of Science and Technology, 01 Dai Co Viet Street, Hanoi 100000, Viet Nam
[3] Avatar, NAVER CLOVA, 6 Buljeong-ro, Bundang-gu, Seongnam-si, Gyeonggi-do, Republic of Korea
Abstract
Hand gestures are becoming an important part of human-machine communication in an era of fast-paced urbanization. This paper introduces a new standard dataset for hand gesture recognition, Static HAnd PosturE (SHAPE), with adequate size, variation, and practicality. Compared with previous datasets, SHAPE offers more classes, subjects, and scenes, and it is also one of the first datasets to focus on Asian subjects performing Asian hand gestures. The SHAPE dataset contains more than 34,000 images collected from 20 distinct subjects with different clothes and backgrounds. A recognition architecture is also presented to investigate the proposed dataset. The architecture consists of two phases: a hand detection phase for preprocessing, followed by a classification phase based on customized state-of-the-art deep neural network models. This paper investigates not only high-accuracy models but also lightweight hand gesture recognition models suitable for resource-constrained devices such as portable edge devices. A promising application of this study is a human-machine interface for small devices that lack the space for a keyboard or a mouse. Our experiments showed that the proposed architecture obtains high accuracy on the self-built dataset. Details of our dataset are available online at https://users.soict.hust.edu.vn/linhdt/dataset/. © 2022, The Author(s), under exclusive licence to Springer-Verlag London Ltd., part of Springer Nature.
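The two-phase architecture described in the abstract (detect the hand region first, then classify the cropped posture) can be sketched as a minimal pipeline. This is an illustrative sketch only, not the paper's implementation: `detect_hand` stands in for the trained hand detector with a simple intensity threshold, and `classify_gesture` stands in for the customized deep networks with a fixed random projection over hand-crafted features. All function names and parameters here are hypothetical.

```python
import numpy as np

def detect_hand(image):
    """Phase 1 (hypothetical stand-in): locate the hand region.
    A simple intensity threshold yields a bounding box; the paper
    uses a trained hand detector for this preprocessing step."""
    ys, xs = np.nonzero(image > 0.5)
    if xs.size == 0:
        return None                       # no hand found
    return xs.min(), ys.min(), xs.max() + 1, ys.max() + 1

def classify_gesture(crop, n_classes=5):
    """Phase 2 (hypothetical stand-in): map the cropped hand to a
    gesture class. The paper customizes lightweight deep networks;
    here a fixed random projection over simple statistics of the
    crop stands in for the trained classifier."""
    rng = np.random.default_rng(0)        # fixed weights for repeatability
    features = np.array([crop.mean(), crop.std(),
                         crop.shape[0] / max(crop.shape[1], 1)])
    weights = rng.normal(size=(n_classes, features.size))
    return int(np.argmax(weights @ features))

def recognize(image):
    """Full pipeline: detection, cropping, then classification."""
    box = detect_hand(image)
    if box is None:
        return None
    x0, y0, x1, y1 = box
    return classify_gesture(image[y0:y1, x0:x1])

# Toy grayscale "image" with a bright blob standing in for a hand.
img = np.zeros((64, 64))
img[20:40, 25:45] = 1.0
print(recognize(img))
```

Cropping before classification lets the second-stage model stay small, which is what makes the lightweight, edge-device-friendly variants discussed in the paper feasible.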
Pages: 21849-21862