Automatic American sign language prediction for static and dynamic gestures using KFM-CNN

Cited by: 0
Authors
Thushara, A. [1 ]
Hani, Reymond Hakkim Baisil [1 ]
Mukundan, Manu [1 ]
Affiliations
[1] Computer Science and Engineering, TKM College of Engineering, Kollam, APJ Abdul Kalam Technological University, Thiruvananthapuram, India
Keywords
Deep learning; Human–computer interaction; Image segmentation; Mapping; Palmprint recognition
DOI
10.1007/s00500-024-09936-0
Abstract
Sign Language Recognition is one of the most important tools for human–computer interaction, and static and dynamic Hand Gesture Recognition (HGR) of American Sign Language (ASL) is among its most significant research topics. Recognizing such gestures is challenging, however, owing to improper contour models. This paper therefore proposes a new Deep Learning (DL) model for static and dynamic ASL HGR. First, videos collected from the dataset are transformed into image frames and passed to noise filtering using an Alternative Window Size-Lone Diagonal Sorting Algorithm, which removes RGB noise. The filtered frames are then grouped into static and dynamic gesture images by Canopy Algorithm-based Minibatch K-Means Clustering. Next, the clustered images are segmented with the YCbCr Palm point and Finger speed-based Threshold in the Region seed Grow Segmentation algorithm, which segments both palm and fingers. Features are then extracted and fed to a Kohonen Feature Mapping-based CNN classifier, which outputs the recognized gesture characters. The proposed model is implemented in a software environment, and its results are compared with prevailing approaches to demonstrate the proposed technique's superiority.
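The clustering and skin-region steps in the abstract's pipeline build on standard techniques. As a rough illustration only (not the paper's implementation: the YCbCr thresholds, the canopy pre-selection, and all parameter values below are generic assumptions), the sketch converts RGB frames to YCbCr, thresholds the Cb/Cr channels for skin-like pixels, and groups frames with scikit-learn's MiniBatchKMeans:

```python
import numpy as np
from sklearn.cluster import MiniBatchKMeans

def rgb_to_ycbcr(rgb):
    # ITU-R BT.601 full-range RGB -> YCbCr conversion
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return np.stack([y, cb, cr], axis=-1)

def skin_mask(rgb, cb_range=(77, 127), cr_range=(133, 173)):
    # Commonly cited YCbCr skin-tone thresholds (illustrative values,
    # not those used in the paper)
    ycbcr = rgb_to_ycbcr(rgb.astype(np.float64))
    cb, cr = ycbcr[..., 1], ycbcr[..., 2]
    return ((cb >= cb_range[0]) & (cb <= cb_range[1]) &
            (cr >= cr_range[0]) & (cr <= cr_range[1]))

def cluster_frames(frames, n_clusters=2, seed=0):
    # Flatten each frame into a feature vector and group the frames with
    # MiniBatchKMeans; the paper's canopy pre-selection step is omitted.
    X = np.stack([f.ravel() for f in frames]).astype(np.float64)
    km = MiniBatchKMeans(n_clusters=n_clusters, random_state=seed, n_init=3)
    return km.fit_predict(X)
```

In a real pipeline the clusters would separate static from dynamic gesture frames and the skin mask would seed the palm/finger region-growing step; here the helpers are deliberately minimal.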
Pages: 11703–11715 (12 pages)
Related papers (50 total)
  • [1] Automatic detection of relevant head gestures in American sign language communication
    Erdem, UM
    Sclaroff, S
    16TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION, VOL I, PROCEEDINGS, 2002, : 460 - 463
  • [2] Automatic detection of relevant head gestures in American sign language communication
    Erdem, Ugur Murat
    Sclaroff, Stan
    Proceedings - International Conference on Pattern Recognition, 2002, 16 (01): : 460 - 463
  • [3] Radar-Based Recognition of Static Hand Gestures in American Sign Language
    Schuessler, Christian
    Zhang, Wenxuan
    Braunig, Johanna
    Hoffman, Marcel
    Stelzig, Michael
    Vossiek, Martin
    2024 IEEE RADAR CONFERENCE, RADARCONF 2024, 2024,
  • [4] Towards Automatic Recognition of Sign Language Gestures Using Kinect 2.0
    Ryumin, Dmitry
    Karpov, Alexey A.
    UNIVERSAL ACCESS IN HUMAN-COMPUTER INTERACTION: DESIGNING NOVEL INTERACTIONS, PT II, 2017, 10278 : 89 - 101
  • [5] Significant Gestures: A History of American Sign Language
    Zikic, Bojan
    ETNOANTROPOLOSKI PROBLEMI-ISSUES IN ETHNOLOGY AND ANTHROPOLOGY, 2008, 3 (02): : 169 - 172
  • [6] Sign Language Static Gestures Recognition Tool Prototype
    Imashev, Alfarabi
    2017 11TH IEEE INTERNATIONAL CONFERENCE ON APPLICATION OF INFORMATION AND COMMUNICATION TECHNOLOGIES (AICT 2017), 2017, : 53 - 56
  • [7] Significant gestures: A history of American sign language
    Greenwald, Brian H.
    HISTORIAN, 2008, 70 (03): : 558 - 559
  • [8] Static Gestures Recognition for Brazilian Sign Language with Kinect Sensor
    Carneiro, Sergio Bessa
    Santos, Edson D. F. de M.
    Barbosa, Talles M. de A.
    Ferreira, Jose O.
    Soares Alcala, Symone G.
    Da Rocha, Adson F.
    2016 IEEE SENSORS, 2016,
  • [9] Recognition of Static Gestures applied to Brazilian Sign Language (Libras)
    Bastos, Igor L. O.
    Angelo, Michele F.
    Loula, Angelo C.
    2015 28TH SIBGRAPI CONFERENCE ON GRAPHICS, PATTERNS AND IMAGES, 2015, : 305 - 312
  • [10] Grammar, gestures, and meaning in American Sign Language.
    Slobin, DI
    LANGUAGE, 2006, 82 (01) : 176 - 179