Emotion classification during music listening from forehead biosignals

Cited by: 0
Authors
Mohsen Naji
Mohammad Firoozabadi
Parviz Azadfallah
Institutions
[1] Islamic Azad University, Department of Biomedical Engineering, Science and Research Branch
[2] Tarbiat Modares University, Department of Medical Physics
[3] Tarbiat Modares University, Department of Psychology
Source
Signal, Image and Video Processing
Keywords
Forehead biosignals; Arousal; Valence; Emotion recognition
DOI
Not available
Abstract
Emotion recognition systems are useful in human–machine interaction and clinical applications. This paper investigates the feasibility of using 3-channel forehead biosignals (left temporalis, frontalis, and right temporalis channels) as informative channels for emotion recognition during music listening. Four emotional states in the arousal–valence space (positive valence/low arousal, positive valence/high arousal, negative valence/high arousal, and negative valence/low arousal) were classified by employing two parallel cascade-forward neural networks as arousal and valence classifiers. The inputs of the classifiers were obtained by applying a fuzzy rough model feature evaluation criterion and the sequential forward floating selection (SFFS) algorithm. An average classification accuracy of 87.05 % was achieved, corresponding to an average valence classification accuracy of 93.66 % and an average arousal classification accuracy of 93.29 %.
Pages: 1365-1375
Page count: 10
Related papers
50 records in total
  • [1] Emotion classification during music listening from forehead biosignals
    Naji, Mohsen
    Firoozabadi, Mohammad
    Azadfallah, Parviz
    SIGNAL IMAGE AND VIDEO PROCESSING, 2015, 9 (06) : 1365 - 1375
  • [2] Emotion Classification Based on Forehead Biosignals using Support Vector Machines in Music Listening
    Naji, Mohsen
    Firoozabadi, Mohammad
    Azadfallah, Parviz
    IEEE 12TH INTERNATIONAL CONFERENCE ON BIOINFORMATICS & BIOENGINEERING, 2012, : 396 - 400
  • [3] Classification of Music-Induced Emotions Based on Information Fusion of Forehead Biosignals and Electrocardiogram
    Mohsen Naji
    Mohammad Firoozabadi
    Parviz Azadfallah
    Cognitive Computation, 2014, 6 : 241 - 252
  • [4] Classification of Music-Induced Emotions Based on Information Fusion of Forehead Biosignals and Electrocardiogram
    Naji, Mohsen
    Firoozabadi, Mohammad
    Azadfallah, Parviz
    COGNITIVE COMPUTATION, 2014, 6 (02) : 241 - 252
  • [5] Comparison of EEG feature vector for emotion classification according to music listening
    Lee, S.-P., Korean Institute of Electrical Engineers, 63
  • [6] Children Listening to Music During Physical Activity, and Their Emotion Regulation
    Treisman, Pamela
    Snethen, Julia
    Treisman, Ruth
    Tsai, Pei-Yun
    Feay-Shaw, Sheila
    Thongpriwan, Vipavee
    Gwon, Joshua
    WESTERN JOURNAL OF NURSING RESEARCH, 2024, 46 (01) : 18S - 18S
  • [7] Alteration of gait characteristics during music listening: The role of emotion and rhythm
    Park, Kyoung Shin
    Lee, Hyokeun
    Fawver, Bradley
    Hass, Chris J.
    Janelle, Christopher M.
    JOURNAL OF SPORT & EXERCISE PSYCHOLOGY, 2017, 39 : S165 - S165
  • [8] Positive emotion learning through music listening
    Milicevic, A
    ICONIP'02: PROCEEDINGS OF THE 9TH INTERNATIONAL CONFERENCE ON NEURAL INFORMATION PROCESSING: COMPUTATIONAL INTELLIGENCE FOR THE E-AGE, 2002, : 125 - 129
  • [9] Music Listening, Emotion, and Cognition in Older Adults
    Vincenzi, Margherita
    Borella, Erika
    Sella, Enrico
    Lima, Cesar F.
    De Beni, Rossana
    Schellenberg, E. Glenn
    BRAIN SCIENCES, 2022, 12 (11)
  • [10] Multilayer perceptron for EEG signal classification during listening to emotional music
    Lin, Yuan-Pin
    Wang, Chi-Hong
    Wu, Tien-Lin
    Jeng, Shyh-Kang
    Chen, Jyh-Horng
    TENCON 2007 - 2007 IEEE REGION 10 CONFERENCE, VOLS 1-3, 2007, : 236 - +