Visual localization ability influences cross-modal bias

Cited by: 117
Authors
Hairston, WD [1 ]
Wallace, MT [1 ]
Vaughan, JW [1 ]
Stein, BE [1 ]
Norris, JL [1 ]
Schirillo, JA [1 ]
Affiliations
[1] Wake Forest Univ, Bowman Gray Sch Med, Dept Neurobiol & Anat, Winston Salem, NC 27157 USA
Keywords
DOI
10.1162/089892903321107792
Chinese Library Classification
Q189 [Neuroscience]
Subject classification code
071006
Abstract
The ability of a visual signal to influence the localization of an auditory target (i.e., "cross-modal bias") was examined as a function of the spatial disparity between the two stimuli and their absolute locations in space. Three experimental issues were examined: (a) the effect of a spatially disparate visual stimulus on auditory localization judgments; (b) how the ability to localize visual, auditory, and spatially aligned multisensory (visual-auditory) targets is related to cross-modal bias; and (c) the relationship between the magnitude of cross-modal bias and the perception that the two stimuli are spatially "unified" (i.e., originate from the same location). Whereas variability in localization of auditory targets was large and fairly uniform for all tested locations, variability in localizing visual or spatially aligned multisensory targets was much smaller, and increased with increasing distance from the midline. This trend proved to be strongly correlated with biasing effectiveness, for although visual-auditory bias was unexpectedly large in all conditions tested, it decreased progressively (as localization variability increased) with increasing distance from the midline. Thus, central visual stimuli had a substantially greater biasing effect on auditory target localization than did more peripheral visual stimuli. It was also apparent that cross-modal bias decreased as the degree of visual-auditory disparity increased. Consequently, the greatest visual-auditory biases were obtained with small disparities at central locations. In all cases, the magnitude of these biases covaried with judgments of spatial unity. The results suggest that functional properties of the visual system play the predominant role in determining these visual-auditory interactions and that cross-modal biases can be substantially greater than previously noted.
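As an illustration only (not part of this record or the paper's own analysis): the abstract's observation that biasing effectiveness falls as localization variability rises is commonly framed with a reliability-weighted (maximum-likelihood) cue-combination model. The Python sketch below uses purely hypothetical noise values and illustrative function names to show how lower visual localization variance at central locations predicts a larger pull of the perceived auditory location toward the visual stimulus.

```python
# Illustrative sketch only: a standard reliability-weighted (maximum-likelihood)
# cue-combination model, not the authors' method. All noise values are hypothetical.

def visual_weight(sigma_v: float, sigma_a: float) -> float:
    """Weight given to the visual cue when combining visual and auditory
    location estimates, assuming independent Gaussian localization noise."""
    return (1.0 / sigma_v**2) / (1.0 / sigma_v**2 + 1.0 / sigma_a**2)

def predicted_auditory_shift(disparity_deg: float, sigma_v: float, sigma_a: float) -> float:
    """Predicted shift of the perceived auditory location toward the visual
    stimulus, for a given visual-auditory spatial disparity (in degrees)."""
    return visual_weight(sigma_v, sigma_a) * disparity_deg

if __name__ == "__main__":
    sigma_a = 6.0  # hypothetical auditory localization SD (deg), roughly constant across azimuth
    for label, sigma_v in [("central visual target", 1.0), ("peripheral visual target", 4.0)]:
        shift = predicted_auditory_shift(disparity_deg=10.0, sigma_v=sigma_v, sigma_a=sigma_a)
        print(f"{label}: visual weight = {visual_weight(sigma_v, sigma_a):.2f}, "
              f"predicted shift = {shift:.1f} deg of a 10 deg disparity")
```

Under these assumed values, the less variable (central) visual cue receives a higher weight and therefore produces a larger predicted shift, consistent in direction with the pattern described in the abstract; the specific numbers are illustrative only.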
Pages: 20 - 29
Page count: 10