When audition alters vision: an event-related potential study of the cross-modal interactions between faces and voices

Cited by: 69
Authors
Joassin, F
Maurage, P
Bruyer, R
Crommelinck, M
Campanella, S
Affiliations
[1] Catholic Univ Louvain, Fac Psychol & Sci Educ, Unite Neurosci Cognit, B-1348 Louvain, Belgium
[2] Catholic Univ Louvain, Fac Med, Unite Neurophysiol Clin, NEFY, Brussels, Belgium
Keywords
cross-modal interactions; faces; voices; ERPs;
DOI
10.1016/j.neulet.2004.07.067
CLC number
Q189 [Neuroscience];
Discipline code
071006;
Abstract
Ten healthy volunteers took part in this event-related potential (ERP) study, which examined the electrophysiological correlates of cross-modal audio-visual interactions in an identification task. Participants were presented either with previously learned faces and voices simultaneously (audio-visual condition, AV) or with faces (visual, V) or voices (auditory, A) separately. As expected, an interference effect of audition on vision was observed at the behavioral level: the bimodal condition was performed more slowly than the visual condition. At the electrophysiological level, the subtraction (AV - (A + V)) revealed three distinct cerebral activities: (1) a central positive/posterior negative wave around 110 ms, (2) a central negative/posterior positive wave around 170 ms, and (3) a central positive wave around 270 ms. These data suggest that cross-modal cerebral interactions could be independent of behavioral facilitation or interference effects. Moreover, the implication of unimodal and multisensory convergence regions in these results, as suggested by a source localization analysis, is discussed. (C) 2004 Elsevier Ireland Ltd. All rights reserved.
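The additive-model subtraction AV - (A + V) used in the abstract to isolate cross-modal interactions can be sketched on trial-averaged ERP arrays. The sketch below uses synthetic data; the electrode count, sampling rate, and epoch window are illustrative assumptions, not parameters from the study.

```python
import numpy as np

# Synthetic trial-averaged ERPs, shape (electrodes, time samples).
# 32 electrodes, 250 Hz sampling, 0-500 ms epoch: assumed values.
rng = np.random.default_rng(0)
fs = 250
times_ms = np.arange(0, 500, 1000 / fs)
n_electrodes, n_samples = 32, times_ms.size

erp_av = rng.normal(size=(n_electrodes, n_samples))  # bimodal condition
erp_a = rng.normal(size=(n_electrodes, n_samples))   # auditory alone
erp_v = rng.normal(size=(n_electrodes, n_samples))   # visual alone

# Cross-modal interaction term: non-zero deflections mark activity in the
# bimodal condition not explained by the sum of the unimodal responses.
interaction = erp_av - (erp_a + erp_v)

# Latency of the largest absolute deflection anywhere on the scalp.
elec, sample = np.unravel_index(np.abs(interaction).argmax(),
                                interaction.shape)
peak_latency_ms = times_ms[sample]
print(f"peak interaction at electrode {elec}, {peak_latency_ms:.0f} ms")
```

On real data the same subtraction would be applied per electrode and time point to condition-averaged waveforms, and peaks in the difference wave (here around 110, 170, and 270 ms) would be interpreted as cross-modal interaction components.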
Pages: 132-137
Page count: 6
Related papers
50 items in total
  • [21] An event-related FMRI study of exogenous orienting across vision and audition
    Yang, Zhen
    Mayer, Andrew R.
    HUMAN BRAIN MAPPING, 2014, 35 (03) : 964 - 974
  • [22] Cross-modal links in exogenous covert spatial orienting between touch, audition, and vision
    Spence, C
    Nicholls, MER
    Gillespie, N
    Driver, J
    PERCEPTION & PSYCHOPHYSICS, 1998, 60 (04): 544 - 557
  • [23] Modulation of somatosensory event-related potential components in a tactile-visual cross-modal task
    Ohara, S
    Lenz, FA
    Zhou, YD
    NEUROSCIENCE, 2006, 138 (04) : 1387 - 1395
  • [24] Cross-modal links in exogenous covert spatial orienting between touch, audition, and vision
    Charles Spence
    Michael E. R. Nicholls
    Nicole Gillespie
    Jon Driver
    PERCEPTION & PSYCHOPHYSICS, 1998, 60: 544 - 557
  • [25] FMRI investigation of cross-modal interactions in beat perception: Audition primes vision, but not vice versa
    Grahn, Jessica A.
    Henry, Molly J.
    McAuley, J. Devin
    NEUROIMAGE, 2011, 54 (02) : 1231 - 1243
  • [26] Cross-modal processing of auditory-visual stimuli in a no-task paradigm:: A topographic event-related potential study
    Vidal, J.
    Giard, M. -H.
    Roux, S.
    Barthelemy, C.
    Bruneau, N.
    CLINICAL NEUROPHYSIOLOGY, 2008, 119 (04) : 763 - 771
  • [27] Altered semantic integration in autism beyond language: a cross-modal event-related potentials study
    Ribeiro, Tatiane C.
    Valasek, Claudia A.
    Minati, Ludovico
    Boggio, Paulo S.
    NEUROREPORT, 2013, 24 (08) : 414 - 418
  • [28] Cross-modal interactions between audition, touch, and vision in endogenous spatial attention: ERP evidence on preparatory states and sensory modulations
    Eimer, M
    van Velzen, J
    Driver, J
    JOURNAL OF COGNITIVE NEUROSCIENCE, 2002, 14 (02) : 254 - 271
  • [29] An event-related potential study of cross-modal translation recognition in Chinese-English bilinguals: the role of cross-linguistic orthography and phonology
    Zhang, Er-Hu
    Li, Jiaxin
    Li, Defeng
    Chen, Yiqiang
    Zhang, Xin-Dong
    Wang, Xinyi
    Cao, Hong-Wen
    LANGUAGE AND COGNITION, 2023, 15 (02) : 292 - 313
  • [30] Within- and cross-modal translation priming: An event-related potential investigation with Chinese-English bilinguals
    Zhang, Er-Hu
    Qin, Jing
    JOURNAL OF NEUROLINGUISTICS, 2025, 74