On the role of crossmodal prediction in audiovisual emotion perception

Cited: 36
Authors
Jessen, Sarah [1 ]
Kotz, Sonja A. [2 ,3 ]
Affiliations
[1] Max Planck Inst Human Cognit & Brain Sci, Res Grp Early Social Dev, D-04103 Leipzig, Germany
[2] Max Planck Inst Human Cognit & Brain Sci, Dept Neuropsychol, Res Grp Subcort Contribut Comprehens, D-04103 Leipzig, Germany
[3] Univ Manchester, Sch Psychol Sci, Manchester, Lancs, England
Keywords
cross-modal prediction; emotion; multisensory; EEG; audiovisual; NEURONAL OSCILLATIONS; FACIAL EXPRESSIONS; BRAIN POTENTIALS; NEURAL PROCESSES; AUDITORY-CORTEX; VISUAL SPEECH; TIME-COURSE; INTEGRATION; BINDING; HUMANS;
DOI
10.3389/fnhum.2013.00369
Chinese Library Classification
Q189 [Neuroscience]
Subject classification code
071006
Abstract
Humans rely on multiple sensory modalities to determine the emotional state of others. In fact, such multisensory perception may be one of the mechanisms explaining the ease and efficiency with which others' emotions are recognized. But how and when exactly do the different modalities interact? One aspect of multisensory perception that has received increasing interest in recent years is the concept of cross-modal prediction. In emotion perception, as in most other settings, visual information precedes auditory information. This visual lead can facilitate subsequent auditory processing. While this mechanism has often been described in audiovisual speech perception, so far it has not been addressed in audiovisual emotion perception. Based on the current state of the art in (a) cross-modal prediction and (b) multisensory emotion perception research, we propose that it is essential to consider the former in order to fully understand the latter. Focusing on electroencephalographic (EEG) and magnetoencephalographic (MEG) studies, we provide a brief overview of the current research in both fields. In discussing these findings, we suggest that emotional visual information may allow more reliable prediction of auditory information than non-emotional visual information. In support of this hypothesis, we present a re-analysis of a previous data set that shows an inverse correlation between the N1 EEG response and the duration of visual emotional, but not non-emotional, information. If the assumption that emotional content allows more reliable prediction can be corroborated in future studies, cross-modal prediction is a crucial factor in our understanding of multisensory emotion perception.
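The re-analysis described above amounts to correlating N1 amplitudes with the duration of the visual information preceding sound onset. The sketch below illustrates only the form of such an analysis; the durations, amplitudes, effect size, and variable names are fabricated for illustration and are not the authors' data.

```python
import numpy as np

# Hypothetical durations (ms) of visual information available before sound onset.
visual_lead_ms = np.array([120.0, 180.0, 240.0, 300.0, 360.0, 420.0, 480.0, 540.0])

# Simulated N1 amplitudes (arbitrary units): we assume a linear attenuation
# with longer visual leads, plus Gaussian noise. These values are invented
# purely to demonstrate the correlation computation.
rng = np.random.default_rng(7)
n1_amplitude = 8.0 - 0.01 * visual_lead_ms + rng.normal(0.0, 0.3, visual_lead_ms.size)

# Pearson correlation between visual-lead duration and N1 amplitude.
r = np.corrcoef(visual_lead_ms, n1_amplitude)[0, 1]
print(f"Pearson r = {r:.2f}")
```

With data of this shape, a negative r would correspond to the inverse relationship the abstract reports for emotional stimuli; for non-emotional stimuli, the same computation would yield no reliable correlation.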
Pages: 7