A computer vision-based system for recognition and classification of Urdu sign language dataset

Cited by: 0
Authors
Zahid H. [1 ,5 ]
Rashid M. [2 ]
Syed S.A. [3 ]
Ullah R. [4 ]
Asif M. [5 ]
Khan M. [3 ]
Mujeeb A.A. [6 ]
Khan A.H. [6 ]
Affiliations
[1] Biomedical Engineering Department and Electrical Engineering Department, Ziauddin University, Karachi
[2] Electrical Engineering Department and Software Engineering Department, Ziauddin University, Karachi
[3] Biomedical Engineering Department, Sir Syed University of Engineering and Technology, Karachi
[4] Optimizia, Karachi
[5] Electrical Engineering Department, Ziauddin University, Karachi
[6] Biomedical Engineering Department, Ziauddin University, Karachi
Keywords
Bag of words; KNN; Pattern recognition; Random Forest; Sign language; SVM; Urdu sign language
DOI
10.7717/PEERJ-CS.1174
Abstract
Human beings rely heavily on social interaction as one of the major aspects of communication, and language is the most effective means of verbal and nonverbal communication and association. To bridge the communication gap between deaf and non-deaf communities, sign language is widely used. According to the World Federation of the Deaf, there are about 70 million deaf people around the globe and about 300 sign languages in use. Hence, structured hand gestures involving visual motions and signs serve as a communication system that helps the deaf and speech-impaired community in daily interaction. The aim of this work is to collect a dataset of Urdu Sign Language (USL) and evaluate it with machine learning classifiers. The proposed system is divided into four main stages: data collection, data acquisition, model training, and model testing. The USL dataset, comprising 1,560 images, was created by photographing various hand positions with a camera. This work provides a strategy for automated identification of USL numbers based on a bag-of-words (BoW) paradigm. For classification, support vector machine (SVM), Random Forest, and K-nearest neighbor (K-NN) classifiers are applied to the BoW histogram bin frequencies as features. The proposed technique outperforms others in number classification, attaining accuracies of 88%, 90%, and 84% for Random Forest, SVM, and K-NN, respectively. © 2022 Zahid et al.
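The abstract describes a BoW pipeline: local features are extracted from the hand images, quantised against a visual vocabulary, and the resulting histogram bin frequencies are fed to SVM, Random Forest, and K-NN classifiers. The sketch below is not the authors' code; it illustrates one way such a pipeline could be assembled with OpenCV and scikit-learn. The directory layout (data/<label>/*.jpg), ORB descriptors, the vocabulary size of 100, and the 80/20 split are assumptions for illustration only.

# Minimal BoW sketch (assumed details, not the paper's implementation)
import glob, os
import cv2
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

VOCAB_SIZE = 100  # number of visual words (assumed; would be tuned on the USL dataset)
orb = cv2.ORB_create()

def load_descriptors(root="data"):
    """Extract local ORB descriptors per image, keeping the class label of each image."""
    per_image, labels = [], []
    for label in sorted(os.listdir(root)):
        for path in glob.glob(os.path.join(root, label, "*.jpg")):
            gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
            if gray is None:
                continue
            _, desc = orb.detectAndCompute(gray, None)
            if desc is not None:
                per_image.append(desc.astype(np.float32))
                labels.append(label)
    return per_image, np.array(labels)

def bow_histograms(per_image, kmeans):
    """Quantise each image's descriptors into a normalised visual-word histogram."""
    hists = np.zeros((len(per_image), VOCAB_SIZE), dtype=np.float32)
    for i, desc in enumerate(per_image):
        for word in kmeans.predict(desc):
            hists[i, word] += 1
        hists[i] /= max(hists[i].sum(), 1)
    return hists

per_image, y = load_descriptors()
# Build the visual vocabulary by clustering all descriptors, then encode each image.
kmeans = KMeans(n_clusters=VOCAB_SIZE, random_state=0).fit(np.vstack(per_image))
X = bow_histograms(per_image, kmeans)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, stratify=y, random_state=0)

# Train and score the three classifiers reported in the abstract.
for name, clf in [("SVM", SVC(kernel="rbf")),
                  ("Random Forest", RandomForestClassifier(n_estimators=100)),
                  ("K-NN", KNeighborsClassifier(n_neighbors=5))]:
    clf.fit(X_tr, y_tr)
    print(name, "accuracy:", clf.score(X_te, y_te))

The choice of ORB descriptors and the hyperparameters above are placeholders; the paper's reported accuracies depend on its own feature extraction and tuning, which this sketch does not reproduce.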
Related Articles
50 records in total (items [41] to [50] shown)
  • [41] Vision-based hand gesture recognition using deep learning for the interpretation of sign language
    Sharma, Sakshi
    Singh, Sukhwinder
    EXPERT SYSTEMS WITH APPLICATIONS, 2021, 182 (182)
  • [42] Research on road recognition and initiative security system of computer vision-based
    Chai, Y
    Liao, CJ
    Guo, MY
    Huang, XY
    DYNAMICS OF CONTINUOUS DISCRETE AND IMPULSIVE SYSTEMS-SERIES B-APPLICATIONS & ALGORITHMS, 2003, : 260 - 264
  • [43] Vision-based gesture recognition system for human-computer interaction
    Trigueiros, Paulo
    Ribeiro, Fernando
    Reis, Luis Paulo
    COMPUTATIONAL VISION AND MEDICAL IMAGE PROCESSING IV, 2014, : 137 - 142
  • [44] Realtime Sign Language Recognition Using Computer Vision and AI
    Serrano, Gabriel
    Kwak, Daehan
    2023 INTERNATIONAL CONFERENCE ON COMPUTATIONAL SCIENCE AND COMPUTATIONAL INTELLIGENCE, CSCI 2023, 2023, : 1214 - 1220
  • [45] Vision-based Hand Gesture Recognition for Indian Sign Language Using Convolution Neural Network
    Gangrade, Jayesh
    Bharti, Jyoti
    IETE JOURNAL OF RESEARCH, 2023, 69 (02) : 723 - 732
  • [46] Computer vision-based recognition of driver distraction: A review
    Moslemi, Negar
    Soryani, Mohsen
    Azmi, Reza
    CONCURRENCY AND COMPUTATION-PRACTICE & EXPERIENCE, 2021, 33 (24):
  • [47] Application of Computer Vision-based Chinese Painting Stroke Recognition and Simulation System
    Yan X.
    Chen J.
    Li W.
    Zhang Z.
COMPUTER-AIDED DESIGN AND APPLICATIONS, 2024, 21 (S15): 35 - 53
  • [48] Computer Vision-based Survey on Human Activity Recognition System, Challenges and Applications
    Manaf, Abdul F.
    Singh, Sukhwinder
    ICSPC'21: 2021 3RD INTERNATIONAL CONFERENCE ON SIGNAL PROCESSING AND COMMUNICATION (ICPSC), 2021, : 110 - 114
  • [49] Vision-Based Traffic Hand Sign Recognition for Driver Assistance
    Madake, Jyoti
    Salway, Hrishikesh
    Sardey, Chaitanya
    Bhatlawande, Shripad
    Shilaskar, Swati
    Proceedings - 2022 OITS International Conference on Information Technology, OCIT 2022, 2022, : 580 - 587
  • [50] Dataset of Pakistan Sign Language and Automatic Recognition of Hand Configuration of Urdu Alphabet through Machine Learning
    Imran, Ali
    Razzaq, Abdul
    Baig, Irfan Ahmad
    Hussain, Aamir
    Shahid, Sharaiz
    Rehman, Tausif-ur
    DATA IN BRIEF, 2021, 36