Study of Sign Language Recognition Using Wearable Sensors

Cited by: 0
Authors
Lee, Boon Giin [1]
Chung, Wan Young [2 ]
Affiliations
[1] Univ Nottingham Ningbo China, Ningbo 315100, Peoples R China
[2] Pukyong Natl Univ, Busan 48513, South Korea
Keywords
Deep learning; Human computer interaction; Sensor fusion; Sign language; Wearable;
DOI
10.1007/978-3-030-68449-5_24
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Sign language was designed to allow hearing-impaired persons to interact with others. Nonetheless, sign language is not common practice in society, which makes communication with the hearing-impaired community difficult. Most existing studies of sign language recognition applied computer-vision approaches; however, these approaches are limited by the viewing angle and are strongly affected by background lighting. In addition, vision-based machine learning (ML) pipelines typically require collaboration among teams of experts and the use of expensive hardware. Thus, this study aimed to develop a smart wearable American Sign Language (ASL) interpretation model using a deep learning method. The proposed model applied sensor fusion to integrate features from six inertial measurement units (IMUs): five IMUs were attached on top of each fingertip, and one IMU was placed on the back of the palm. The study revealed that ASL gesture recognition with derived features, including angular rate, acceleration, and orientation, achieved a mean true sign recognition rate of 99.81%. In conclusion, the proposed smart wearable ASL interpretation model was designed to help hearing-impaired persons communicate with society in the most convenient way possible.
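The abstract describes fusing three derived features (angular rate, acceleration, orientation) from six IMUs into a single input for a deep learning classifier. As a minimal sketch of that feature layout, the snippet below flattens one time step of hypothetical 3-axis readings into a 54-dimensional fused vector; the array shapes and names are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

# Assumed layout of one time step of sensor data (not the authors' code):
# six IMUs (five fingertips + back of the palm), each reporting three
# derived features (angular rate, acceleration, orientation), 3 axes each.
N_IMUS = 6
N_FEATURES = 3
N_AXES = 3

def fuse_imu_features(samples: np.ndarray) -> np.ndarray:
    """Flatten one time step of IMU readings into a single feature vector.

    samples: array of shape (N_IMUS, N_FEATURES, N_AXES)
    returns: 1-D vector of length N_IMUS * N_FEATURES * N_AXES (= 54),
             suitable as one input frame for a sequence classifier.
    """
    assert samples.shape == (N_IMUS, N_FEATURES, N_AXES)
    return samples.reshape(-1)

# Example: one time step of readings fuses into a 54-dimensional vector.
step = np.random.default_rng(0).normal(size=(N_IMUS, N_FEATURES, N_AXES))
fused = fuse_imu_features(step)
print(fused.shape)  # (54,)
```

A sequence of such per-step vectors, stacked over a gesture's duration, would form the time-series input that a recurrent or convolutional deep learning model could classify into ASL signs.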
Pages: 229-237
Page count: 9
Related papers (50 items)
  • [31] A study of Japanese sign language recognition using human skeleton data
    Takazume, Alyssa
    Yata, Noriko
    Manabe, Yoshitsugu
    INTERNATIONAL WORKSHOP ON ADVANCED IMAGING TECHNOLOGY, IWAIT 2023, 2023, 12592
  • [33] American Sign Language Translation Using Wearable Inertial and Electromyography Sensors for Tracking Hand Movements and Facial Expressions
    Gu, Yutong
    Zheng, Chao
    Todoh, Masahiro
    Zha, Fusheng
    FRONTIERS IN NEUROSCIENCE, 2022, 16
  • [34] A Wearable System for Recognizing American Sign Language in Real-Time Using IMU and Surface EMG Sensors
    Wu, Jian
    Sun, Lu
    Jafari, Roozbeh
    IEEE JOURNAL OF BIOMEDICAL AND HEALTH INFORMATICS, 2016, 20 (05) : 1281 - 1290
  • [35] Gait Recognition Using Wearable Motion Recording Sensors
    Gafurov, Davrondzhon
    Snekkenes, Einar
    EURASIP JOURNAL ON ADVANCES IN SIGNAL PROCESSING, 2009
  • [36] Recognition of gait cycle phases using wearable sensors
    Mohammed, Samer
    Same, Allou
    Oukhellou, Latifa
    Kong, Kyoungchul
    Huo, Weiguang
    Amirat, Yacine
    ROBOTICS AND AUTONOMOUS SYSTEMS, 2016, 75 : 50 - 59
  • [37] Stress Recognition using Wearable Sensors and Mobile Phones
    Sano, Akane
    Picard, Rosalind W.
    2013 HUMAINE ASSOCIATION CONFERENCE ON AFFECTIVE COMPUTING AND INTELLIGENT INTERACTION (ACII), 2013, : 671 - 676
  • [38] Human Activity Recognition Using Wearable Accelerometer Sensors
    Zubair, Muhammad
    Song, Kibong
    Yoon, Changwoo
    2016 IEEE INTERNATIONAL CONFERENCE ON CONSUMER ELECTRONICS-ASIA (ICCE-ASIA), 2016,
  • [39] Flexible Gesture Recognition Using Wearable Inertial Sensors
    Abualola, Huda
    Al Ghothani, Hanin
    Eddin, Abdulrahim Naser
    Almoosa, Nawaf
    Poon, Kin
    2016 IEEE 59TH INTERNATIONAL MIDWEST SYMPOSIUM ON CIRCUITS AND SYSTEMS (MWSCAS), 2016, : 810 - 813