KUNet-An Optimized AI Based Bengali Sign Language Translator for Hearing Impaired and Non Verbal People

Cited: 0
Authors
Jim, Abdullah Al Jaid [1 ,2 ]
Rafi, Ibrahim [2 ]
Tiang, Jun Jiat [3 ]
Biswas, Uzzal [2 ]
Abdullah-Al Nahid [2 ]
Affiliations
[1] Trust Univ, Dept Elect & Elect Engn, Barishal 8200, Bangladesh
[2] Khulna Univ, Elect & Commun Engn Discipline, Khulna 9208, Bangladesh
[3] Multimedia Univ, Fac Engn, Ctr Wireless Technol CWT, Cyberjaya 63100, Malaysia
Source
IEEE ACCESS, 2024, Vol. 12
Keywords
Sign language; Accuracy; Computational modeling; Support vector machines; Genetic algorithms; Image recognition; Feature extraction; Databases; Assistive technologies; Zero shot learning; Deep learning; Machine learning; Bengali sign language (BdSL); classification; computer vision; sign language recognition
DOI
10.1109/ACCESS.2024.3474011
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Sign language is the most prevalent form of communication among people with speech and hearing disabilities. The most widely used types of sign language involve static or dynamic gestures made with the hand(s). Among the many sign languages, Bengali Sign Language (BdSL) is one of the most complicated to learn and comprehend because of its large alphabet, vocabulary, and variation in expression techniques. Existing solutions amount to learning BdSL or hiring an interpreter, and BdSL interpreter support is hard to come by and expensive (unless voluntary). People with speech and hearing disabilities might find it more comfortable to converse with the general population through machine translation of sign language. Deep learning, a subset of machine learning that mimics the human brain, seems to be a viable solution, and computer vision in particular may hold the key for the hearing-impaired and non-verbal community. Therefore, we propose a novel model, KUNet ("Khulna University Network", a CNN-based model), a classification framework optimized by a genetic algorithm (GA), to classify BdSL. This model and the accompanying dataset contribute to building a BdSL machine translator. The GA-optimized KUNet achieved an accuracy of 99.11% on the KU-BdSL dataset. After training the model on KU-BdSL, we compared it with state-of-the-art studies and interpreted its black-box nature using explainable AI (XAI). Additionally, our model outperformed several well-known models trained on the KU-BdSL dataset. This study will benefit the hearing-impaired and non-verbal community by allowing them to communicate effortlessly and minimizing their hardship.
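To make the approach described in the abstract concrete, below is a minimal, hypothetical sketch in Python (TensorFlow/Keras) of a CNN image classifier whose hyperparameters are tuned by a simple genetic algorithm. It is not the authors' KUNet implementation: the layer sizes, the 30-class output, the input resolution, the search space, and the stand-in random data are all illustrative assumptions, and real KU-BdSL images and labels would replace the placeholder arrays.

```python
# Hypothetical sketch of a GA-tuned CNN classifier in the spirit of the
# abstract above; NOT the authors' KUNet. All sizes and the search space
# are assumptions for illustration only.
import random
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

NUM_CLASSES = 30          # assumed number of BdSL sign classes
IMG_SHAPE = (64, 64, 3)   # assumed input resolution

def build_cnn(filters, dense_units, learning_rate):
    """Small CNN classifier parameterised by GA-searched hyperparameters."""
    model = tf.keras.Sequential([
        tf.keras.Input(shape=IMG_SHAPE),
        layers.Conv2D(filters, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(filters * 2, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(dense_units, activation="relu"),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate),
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Illustrative GA search space over three hyperparameters.
SPACE = {"filters": [16, 32, 64],
         "dense_units": [64, 128, 256],
         "learning_rate": [1e-2, 1e-3, 1e-4]}

def random_individual():
    return {k: random.choice(v) for k, v in SPACE.items()}

def crossover(a, b):
    # Uniform crossover: each gene is taken from either parent.
    return {k: random.choice([a[k], b[k]]) for k in SPACE}

def mutate(ind, rate=0.2):
    # Each gene is resampled from the search space with probability `rate`.
    return {k: (random.choice(SPACE[k]) if random.random() < rate else v)
            for k, v in ind.items()}

def ga_search(x_train, y_train, x_val, y_val, pop_size=6, generations=3):
    """Evolve hyperparameters; fitness = validation accuracy after a short run."""
    def score(ind):
        model = build_cnn(**ind)
        model.fit(x_train, y_train, epochs=2, batch_size=32, verbose=0)
        _, acc = model.evaluate(x_val, y_val, verbose=0)
        return acc

    population = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(population, key=score, reverse=True)
        parents = ranked[: pop_size // 2]          # truncation selection
        children = [mutate(crossover(*random.sample(parents, 2)))
                    for _ in range(pop_size - len(parents))]
        population = parents + children
    return max(population, key=score)

if __name__ == "__main__":
    # Random stand-in data; replace with KU-BdSL images and integer labels.
    x = np.random.rand(200, *IMG_SHAPE).astype("float32")
    y = np.random.randint(0, NUM_CLASSES, size=200)
    best = ga_search(x[:160], y[:160], x[160:], y[160:])
    print("Best hyperparameters found:", best)
```

The short training budget per individual keeps the GA's fitness evaluations cheap; a full implementation would train the winning configuration to convergence before reporting accuracy.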
Pages: 155052 - 155063
Page count: 12