Baby Cry Recognition by BCRNet Using Transfer Learning and Deep Feature Fusion

Cited by: 2
Authors
Zhang, Ke [1 ]
Ting, Hua-Nong [1 ,2 ]
Choo, Yao-Mun [3 ]
Affiliations
[1] Univ Malaya, Fac Engn, Dept Biomed Engn, Kuala Lumpur 50603, Malaysia
[2] Jining Med Univ, Fac Med Engn, Jining 272067, Shandong, Peoples R China
[3] Univ Malaya, Fac Med, Dept Paediat, Kuala Lumpur 50603, Malaysia
Keywords
Baby cry; recognition; transfer learning; autoencoder; feature fusion; deep neural network; classification
DOI
10.1109/ACCESS.2023.3330789
Chinese Library Classification (CLC)
TP [Automation Technology; Computer Technology]
Discipline Code
0812
Abstract
Deep learning theory has made remarkable advancements in the field of baby cry recognition, significantly enhancing its accuracy. Nonetheless, existing research faces two challenges. Firstly, the limited size of available databases increases the risk of overfitting for a deep learning model. Secondly, the integration of multi-domain features has been neglected. To address these issues, a novel approach called BCRNet is proposed, which combines transfer learning and feature fusion. The BCRNet model takes multi-domain features as input and extracts deep features using a transfer learning model. Subsequently, a multilayer autoencoder is utilized for feature reduction, and a Support Vector Machine (SVM) is employed to select the transfer learning model with the highest classification accuracy. The two feature sets are then concatenated to form fused features. Finally, the fused features are fed into a deep neural network for classification. Experimental results show that the proposed model is effective in mitigating the overfitting problem caused by small datasets, and that the fused features outperform the single-domain features used in existing methods.
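The fusion pipeline described in the abstract can be sketched in simplified form. The snippet below uses synthetic arrays as stand-ins for the paper's acoustic features and transfer-learning embeddings, and a linear SVD projection as a simplified stand-in for the multilayer autoencoder; the dimensions and names are illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for the two feature streams: deep features from a
# pretrained transfer-learning model, and hand-crafted multi-domain features.
n_samples = 32
deep_feats = rng.standard_normal((n_samples, 512))   # e.g. CNN embeddings
domain_feats = rng.standard_normal((n_samples, 64))  # e.g. time/frequency features

# Simplified stand-in for the multilayer autoencoder: project the centered
# deep features onto their top 32 principal directions via SVD.
centered = deep_feats - deep_feats.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
reduced = centered @ vt[:32].T

# Feature fusion by concatenation, as described in the abstract; the fused
# vector would then be fed to a deep neural network classifier.
fused = np.concatenate([reduced, domain_feats], axis=1)
print(fused.shape)  # (32, 96)
```

In the paper itself, the reduction step is a trained multilayer autoencoder and an SVM is used to pick the best transfer-learning backbone; the linear projection here only illustrates the dimensionality flow.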
Pages: 126251-126262
Page count: 12
Related Papers
50 items total
  • [41] Music Feature Recognition and Classification Using a Deep Learning Algorithm
    Xu, Lihong
    Zhang, Shenghuan
    INTERNATIONAL JOURNAL OF COMPUTATIONAL INTELLIGENCE AND APPLICATIONS, 2023, 22 (03)
  • [42] Feature Extraction Using Deep Learning for Food Type Recognition
    Farooq, Muhammad
    Sazonov, Edward
    BIOINFORMATICS AND BIOMEDICAL ENGINEERING, IWBBIO 2017, PT I, 2017, 10208 : 464 - 472
  • [43] Medical equipment recognition using deep transfer learning
    Wong, Shi-Ting
    Too, Chian-Wen
    Yap, Wun-She
    Khor, Kok-Chin
    JOURNAL OF INTELLIGENT & FUZZY SYSTEMS, 2022, 43 (01) : 1001 - 1010
  • [44] Epileptic Classification With Deep-Transfer-Learning-Based Feature Fusion Algorithm
    Cao, Jiuwen
    Hu, Dinghan
    Wang, Yaomin
    Wang, Jianzhong
    Lei, Baiying
    IEEE TRANSACTIONS ON COGNITIVE AND DEVELOPMENTAL SYSTEMS, 2022, 14 (02) : 684 - 695
  • [45] Implementation of multimodal biometric recognition via multi-feature deep learning networks and feature fusion
    Tiong, Leslie Ching Ow
    Kim, Seong Tae
    Ro, Yong Man
    MULTIMEDIA TOOLS AND APPLICATIONS, 2019, 78 (16) : 22743 - 22772
  • [47] Feature Extraction from Several Angular Faces Using a Deep Learning Based Fusion Technique for Face Recognition
    Charoqdouz, E.
    Hassanpour, H.
    INTERNATIONAL JOURNAL OF ENGINEERING, 2023, 36 (08): : 1548 - 1555
  • [49] Wearable Sensor-based physical activity intensity recognition using deep learning feature engineering fusion
    Qiu, Jia-Gang
    Li, Yi
    Li, Hui
    Wang, Zhen
    Pang, Lei
    Sun, Gang
    MEASUREMENT, 2025, 241
  • [50] Multilevel Scattering Center and Deep Feature Fusion Learning Framework for SAR Target Recognition
    Liu, Zhunga
    Wang, Longfei
    Wen, Zaidao
    Li, Kun
    Pan, Quan
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2022, 60