Deep Learning-Based MobileNet and Multi-Head Attention Model for Facial Expression Recognition

Cited by: 0
Authors
Nouisser, Aicha [1 ]
Zouari, Ramzi [2 ]
Kherallah, Monji [3 ]
Affiliations
[1] Univ Gafsa, Fac Sci Gafsa, Gafsa, Tunisia
[2] Univ Sfax, Natl Sch Engn Sfax, Sfax, Tunisia
[3] Univ Sfax, Fac Sci Sfax, Sfax, Tunisia
Keywords
Depthwise; pointwise; attention; balanced; skip connection; transfer learning
DOI
Not available
CLC Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Facial expressions are an intuitive reflection of a person's emotional state and one of the most important forms of interpersonal communication. Due to the complexity and variability of human facial expressions, traditional methods based on handcrafted feature extraction have shown insufficient performance. For this reason, we proposed a new facial expression recognition system based on the MobileNet model, with skip connections added to prevent performance degradation in deeper architectures. Moreover, a multi-head attention mechanism was applied to focus processing on the most relevant parts of the image. The experiments were conducted on the FER2013 database, which is imbalanced and contains ambiguous images, some of them synthetic faces. We applied a face-detection pre-processing step to eliminate invalid images, and we implemented both the SMOTE and Near-Miss algorithms to balance the dataset and prevent the model from becoming biased. The experimental results showed the effectiveness of the proposed framework, which achieved a recognition rate of 96.02% when applying the multi-head attention mechanism.
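To make the described pipeline concrete, the following is a minimal Keras sketch of the kind of model the abstract outlines: a MobileNet backbone (depthwise + pointwise convolutions), a skip connection around a small refinement block, and multi-head attention over the spatial feature map, followed by a 7-way softmax for the FER2013 classes. The input resolution, layer sizes, number of attention heads, and optimizer are illustrative assumptions, not the authors' exact configuration.

```python
# Sketch of a MobileNet + multi-head attention classifier for facial expressions.
# Assumptions: 48x48 grayscale FER2013 images resized and replicated to 96x96 RGB,
# one attention block with 4 heads, and randomly initialized backbone weights.
import tensorflow as tf
from tensorflow.keras import layers, Model

NUM_CLASSES = 7            # FER2013 expression categories
INPUT_SHAPE = (96, 96, 3)  # assumed input size after resizing/channel replication

def build_model(num_heads: int = 4, key_dim: int = 64) -> Model:
    inputs = layers.Input(shape=INPUT_SHAPE)

    # Depthwise-separable convolutional backbone.
    backbone = tf.keras.applications.MobileNet(
        include_top=False, weights=None, input_tensor=inputs)
    x = backbone.output                       # (batch, h, w, c) feature map

    # Skip connection: a pointwise refinement block added back to its input,
    # intended to counter performance degradation in deeper stacks.
    refined = layers.Conv2D(x.shape[-1], 1, padding="same", activation="relu")(x)
    x = layers.Add()([x, refined])

    # Multi-head self-attention over spatial positions, so the classifier can
    # concentrate on the most relevant facial regions.
    h, w, c = x.shape[1], x.shape[2], x.shape[3]
    seq = layers.Reshape((h * w, c))(x)
    attn = layers.MultiHeadAttention(num_heads=num_heads, key_dim=key_dim)(seq, seq)
    seq = layers.LayerNormalization()(layers.Add()([seq, attn]))

    x = layers.GlobalAveragePooling1D()(seq)
    outputs = layers.Dense(NUM_CLASSES, activation="softmax")(x)
    return Model(inputs, outputs)

model = build_model()
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
```

The class balancing mentioned in the abstract (SMOTE oversampling and Near-Miss undersampling) would be applied to the training split before fitting this model, for example with the imbalanced-learn library; that step is omitted here for brevity.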
Pages: 485-491
Number of pages: 7