Multimodal driver emotion recognition using motor activity and facial expressions

Cited by: 1
Authors
Espino-Salinas, Carlos H. [1 ]
Luna-Garcia, Huizilopoztli [1 ]
Celaya-Padilla, Jose M. [1 ]
Barria-Huidobro, Cristian [2 ]
Gamboa Rosales, Nadia Karina [1 ]
Rondon, David [3 ]
Villalba-Condori, Klinge Orlando [4 ]
Affiliations
[1] Univ Autonoma Zacatecas, Unidad Acad Ingn Elect, Lab Tecnol Interact & Experiencia Usuario, Zacatecas, Mexico
[2] Univ Mayor Chile, Ctr Invest Cibersegur, Providencia, Chile
[3] Univ Continental, Dept Estudios Gen, Arequipa, Peru
[4] Catolica Univ Santa Maria, Vicerrectorado Invest, Arequipa, Peru
Source
FRONTIERS IN ARTIFICIAL INTELLIGENCE, 2024
Keywords
facial emotion recognition; motor activity; driver emotions; transfer learning; convolutional neural network; ADAS; neural network; performance; system
DOI
10.3389/frai.2024.1467051
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Driving performance can be significantly impaired when a person experiences intense emotions behind the wheel. Research shows that emotions such as anger, sadness, agitation, and joy can increase the risk of traffic accidents. This study introduces a methodology for recognizing four specific emotions with an intelligent model that processes and analyzes signals from motor activity and driver behavior, generated by interactions with basic driving elements, together with facial geometry images captured during emotion induction. The research applies machine learning to identify the motor activity signals most relevant to emotion recognition. In addition, a pre-trained Convolutional Neural Network (CNN) model is employed to extract probability vectors from images corresponding to the four emotions under investigation. These data sources are integrated through a one-dimensional network for emotion classification. The main proposal of this research was a multimodal intelligent model that combines motor activity signals and facial geometry images to accurately recognize four specific emotions (anger, sadness, agitation, and joy) in drivers, achieving 96.0% accuracy in a simulated environment. The study confirmed a significant relationship between drivers' motor activity, behavior, facial geometry, and the induced emotions.
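The abstract describes a two-branch architecture: a pre-trained CNN produces a per-emotion probability vector from each facial geometry image, and this vector is fused with the selected motor-activity signals in a one-dimensional network that outputs one of the four emotions. The Keras sketch below illustrates that fusion pattern only; the backbone (MobileNetV2), the input resolution, the number of motor-activity features, and all layer sizes are assumptions made for illustration and are not taken from the paper.

```python
# Minimal sketch of the multimodal fusion described in the abstract.
# Assumptions (not from the paper): MobileNetV2 backbone, 224x224 inputs,
# 64 selected motor-activity features, and the layer sizes shown below.
import tensorflow as tf
from tensorflow.keras import layers, Model

NUM_EMOTIONS = 4          # anger, sadness, agitation, joy
NUM_MOTOR_FEATURES = 64   # assumed count of selected motor-activity signals

# Branch 1: transfer-learning CNN yielding a probability vector per face image.
base_cnn = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet")
base_cnn.trainable = False  # freeze the pre-trained backbone

image_in = layers.Input(shape=(224, 224, 3), name="facial_geometry_image")
x = base_cnn(image_in, training=False)
x = layers.GlobalAveragePooling2D()(x)
face_probs = layers.Dense(NUM_EMOTIONS, activation="softmax",
                          name="face_probability_vector")(x)

# Branch 2: one-dimensional network over the selected motor-activity signals.
motor_in = layers.Input(shape=(NUM_MOTOR_FEATURES, 1), name="motor_activity")
y = layers.Conv1D(32, kernel_size=3, activation="relu")(motor_in)
y = layers.MaxPooling1D(pool_size=2)(y)
y = layers.Flatten()(y)

# Fusion: concatenate both modalities and classify the four emotions.
fused = layers.concatenate([face_probs, y])
fused = layers.Dense(32, activation="relu")(fused)
out = layers.Dense(NUM_EMOTIONS, activation="softmax", name="emotion")(fused)

model = Model(inputs=[image_in, motor_in], outputs=out)
model.compile(optimizer="adam", loss="categorical_crossentropy",
              metrics=["accuracy"])
```

Note that, following the abstract, the image branch contributes its class-probability vector rather than raw CNN features, i.e., a late-fusion scheme in which the one-dimensional network combines the CNN's soft predictions with the motor-activity inputs.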
Pages: 24
Related Papers
50 records in total
  • [1] Fusion of Facial Expressions and EEG for Multimodal Emotion Recognition
    Huang, Yongrui
    Yang, Jianhao
    Liao, Pengkai
    Pan, Jiahui
    COMPUTATIONAL INTELLIGENCE AND NEUROSCIENCE, 2017, 2017
  • [2] Emotion Recognition Using Facial Expressions
    Jasuja, Arush
    Rathee, Sonia
    INTERNATIONAL JOURNAL OF INFORMATION RETRIEVAL RESEARCH, 2021, 11 (03) : 1 - 17
  • [3] Emotion recognition using facial expressions
    Tarnowski, Pawel
    Kolodziej, Marcin
    Majkowski, Andrzej
    Rak, Remigiusz J.
    INTERNATIONAL CONFERENCE ON COMPUTATIONAL SCIENCE (ICCS 2017), 2017, 108 : 1175 - 1184
  • [4] Emotion Recognition from EEG and Facial Expressions: a Multimodal Approach
    Chaparro, Valentina
    Gomez, Alejandro
    Salgado, Alejandro
    Quintero, O. Lucia
    Lopez, Natalia
    Villa, Luisa F.
    2018 40TH ANNUAL INTERNATIONAL CONFERENCE OF THE IEEE ENGINEERING IN MEDICINE AND BIOLOGY SOCIETY (EMBC), 2018, : 530 - 533
  • [5] A multimodal emotion recognition method based on facial expressions and electroencephalography
    Tan, Ying
    Sun, Zhe
    Duan, Feng
    Sole-Casals, Jordi
    Caiafa, Cesar F.
    BIOMEDICAL SIGNAL PROCESSING AND CONTROL, 2021, 70
  • [6] Multimodal Emotion Recognition Based on Facial Expressions, Speech, and EEG
    Pan, Jiahui
    Fang, Weijie
    Zhang, Zhihang
    Chen, Bingzhi
    Zhang, Zheng
    Wang, Shuihua
    IEEE OPEN JOURNAL OF ENGINEERING IN MEDICINE AND BIOLOGY, 2024, 5 : 396 - 403
  • [7] Multimodal Emotion Recognition From EEG Signals and Facial Expressions
    Wang, Shuai
    Qu, Jingzi
    Zhang, Yong
    Zhang, Yidie
    IEEE ACCESS, 2023, 11 : 33061 - 33068
  • [8] Training Emotion Recognition Accuracy: Results for Multimodal Expressions and Facial Micro Expressions
    Dollinger, Lillian
    Laukka, Petri
    Hogman, Lennart Bjorn
    Banziger, Tanja
    Makower, Irena
    Fischer, Hakan
    Hau, Stephan
    FRONTIERS IN PSYCHOLOGY, 2021, 12
  • [9] Multimodal Emotion Recognition: Emotion Classification Through the Integration of EEG and Facial Expressions
    Guler, Songul Erdem
    Akbulut, Fatma Patlar
    IEEE ACCESS, 2025, 13 : 24587 - 24603
  • [10] Multimodal Emotion Recognition Based on Facial Expressions, Speech, and Body Gestures
    Yan, Jingjie
    Li, Peiyuan
    Du, Chengkun
    Zhu, Kang
    Zhou, Xiaoyang
    Liu, Ying
    Wei, Jinsheng
    ELECTRONICS, 2024, 13 (18)