Study of Sign Language Recognition Using Wearable Sensors

Cited: 0
Authors
Lee, Boon Giin [1 ]
Chung, Wan Young [2 ]
Institutions
[1] Univ Nottingham Ningbo China, Ningbo 315100, Peoples R China
[2] Pukyong Natl Univ, Busan 48513, South Korea
Keywords
Deep learning; Human computer interaction; Sensor fusion; Sign language; Wearable;
DOI
10.1007/978-3-030-68449-5_24
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Sign language enables hearing-impaired people to interact with others. However, sign language is not widely practiced in society, which makes communication with the hearing-impaired community difficult. Most existing studies of sign language recognition apply computer vision approaches, but these are limited by the camera's viewing angle and are strongly affected by background lighting. In addition, computer vision involves machine learning (ML) that typically requires collaboration among a team of experts, along with expensive hardware. This study therefore aimed to develop a smart wearable American Sign Language (ASL) interpretation model using a deep learning method. The proposed model applies sensor fusion to integrate features from six inertial measurement units (IMUs): five IMUs attached to the fingertips and one placed on the back of the hand. The study showed that ASL gesture recognition with derived features, including angular rate, acceleration, and orientation, achieved a mean true sign recognition rate of 99.81%. In conclusion, the proposed smart wearable ASL interpretation model is intended to help hearing-impaired people communicate with society as conveniently as possible.
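The abstract describes fusing angular rate, acceleration, and orientation features from six IMUs into a single input for a deep learning classifier. A minimal sketch of that concatenation step, assuming a 10-value feature vector per IMU (the paper's actual feature layout and fusion implementation are not given in the abstract, so the shapes below are illustrative assumptions):

```python
# Illustrative sketch only: the per-IMU layout (3-axis angular rate,
# 3-axis acceleration, 4-element orientation quaternion) is an assumption
# based on the features the abstract names, not the authors' exact pipeline.

N_IMUS = 6                    # five fingertip IMUs + one on the back of the hand
FEATURES_PER_IMU = 3 + 3 + 4  # angular rate + acceleration + orientation quaternion

def fuse_imu_features(imu_readings):
    """Concatenate per-IMU feature vectors into one fused feature vector."""
    if len(imu_readings) != N_IMUS:
        raise ValueError(f"expected {N_IMUS} IMU readings, got {len(imu_readings)}")
    fused = []
    for reading in imu_readings:
        if len(reading) != FEATURES_PER_IMU:
            raise ValueError(f"each IMU should report {FEATURES_PER_IMU} features")
        fused.extend(reading)
    return fused

# One synthetic sample: six IMUs reporting ten features each
sample = [[0.0] * FEATURES_PER_IMU for _ in range(N_IMUS)]
fused = fuse_imu_features(sample)
print(len(fused))  # 60 values per gesture sample fed to the classifier
```

The fused 60-value vector (or a time series of such vectors) would then serve as the input to the deep learning model the abstract mentions.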
Pages: 229 - 237
Page count: 9
Related Papers
50 records total
  • [41] Activity recognition using wearable sensors for tracking the elderly
    Stylianos Paraschiakos
    Ricardo Cachucho
    Matthijs Moed
    Diana van Heemst
    Simon Mooijaart
    Eline P. Slagboom
    Arno Knobbe
    Marian Beekman
    User Modeling and User-Adapted Interaction, 2020, 30 : 567 - 605
  • [42] Deep Human Activity Recognition Using Wearable Sensors
    Lawal, Isah A.
    Bano, Sophia
    12TH ACM INTERNATIONAL CONFERENCE ON PERVASIVE TECHNOLOGIES RELATED TO ASSISTIVE ENVIRONMENTS (PETRA 2019), 2019, : 45 - 48
  • [43] A Survey on Human Activity Recognition using Wearable Sensors
    Lara, Oscar D.
    Labrador, Miguel A.
    IEEE COMMUNICATIONS SURVEYS AND TUTORIALS, 2013, 15 (03): : 1192 - 1209
  • [44] Challenges on activity recognition techniques using wearable sensors
    Terada, Tsutomu
    Computer Software, 2011, 28 (02) : 43 - 54
  • [45] Robust Activity Recognition using Wearable IMU Sensors
    Prathivadi, Yashaswini
    Wu, Jian
    Bennett, Terrell R.
    Jafari, Roozbeh
    2014 IEEE SENSORS, 2014,
  • [46] Physical Human Activity Recognition Using Wearable Sensors
    Attal, Ferhat
    Mohammed, Samer
    Dedabrishvili, Mariam
    Chamroukhi, Faicel
    Oukhellou, Latifa
    Amirat, Yacine
    SENSORS, 2015, 15 (12) : 31314 - 31338
  • [48] Activity Recognition using Wearable Sensors for Elder Care
    Hong, Yu-Jin
    Kim, Ig-Jae
    Ahn, Sang Chul
    Kim, Hyoung-Gon
    FGCN: PROCEEDINGS OF THE 2008 SECOND INTERNATIONAL CONFERENCE ON FUTURE GENERATION COMMUNICATION AND NETWORKING, VOLS 1 AND 2, 2008, : 791 - 794
  • [49] Indian sign language recognition using SVM
    Raheja J.L.
    Mishra A.
    Chaudhary A.
    Pattern Recognition and Image Analysis, 2016, 26 (2) : 434 - 441
  • [50] Recognition of Sign Language using Capsule Networks
    Beser, Fuat
    Kizrak, Merve Ayyuce
    Bolat, Bulent
    Yildirim, Tulay
    2018 26TH SIGNAL PROCESSING AND COMMUNICATIONS APPLICATIONS CONFERENCE (SIU), 2018,