Audiovisual integration in the human perception of materials

Cited by: 38
Authors
Fujisaki, Waka [1 ]
Goda, Naokazu [2 ]
Motoyoshi, Isamu [3 ]
Komatsu, Hidehiko [2 ]
Nishida, Shin'ya [3 ]
Affiliations
[1] Natl Inst Adv Ind Sci & Technol, Human Technol Res Inst, Tsukuba, Ibaraki, Japan
[2] Natl Inst Physiol Sci, Div Sensory & Cognit Informat, Okazaki, Aichi 444, Japan
[3] NTT Corp, NTT Commun Sci Labs, Atsugi, Kanagawa, Japan
Source
JOURNAL OF VISION | 2014, Vol. 14, Issue 4
Keywords
material perception; audio-visual integration; Bayesian integration; surface texture; impact sound; VISUAL-PERCEPTION; SURFACE; REPRESENTATION; TEXTURE; COLOR; FMRI; FORM; INFORMATION; GLOSS;
DOI
10.1167/14.4.12
Chinese Library Classification
R77 [Ophthalmology]
Discipline code
100212
Abstract
Interest in the perception of the material of objects has been growing. While material perception is a critical ability for animals to properly regulate behavioral interactions with surrounding objects (e.g., eating), little is known about its underlying processing. Vision and audition provide useful information for material perception; using only visual appearance or impact sound, we can infer what an object is made from. However, what material is perceived when the visual appearance of one material is combined with the impact sound of another, and what rules govern the cross-modal integration of material information? We addressed these questions by asking 16 human participants to rate how likely it was that audiovisual stimuli (48 combinations of visual appearances of six materials and impact sounds of eight materials), along with visual-only and auditory-only stimuli, fell into each of 13 material categories. The results indicated strong audiovisual interactions in material perception; for example, the appearance of glass paired with a pepper sound is perceived as transparent plastic. Ratings of material-category likelihoods follow a multiplicative integration rule, in that the categories judged to be likely are consistent with both the visual and the auditory stimulus. On the other hand, ratings of material properties, such as roughness and hardness, follow a weighted-average rule. Despite the difference in their integration calculations, both rules can be interpreted as optimal Bayesian integration of independent audiovisual estimations for the two types of material judgment, respectively.
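The two integration rules described in the abstract can be sketched numerically. Below is a minimal illustration: a multiplicative rule for category likelihoods and a reliability-weighted average for a continuous property rating. All numbers (the three-category likelihoods, the hardness estimates, and the variances) are made up for illustration and do not come from the paper.

```python
import numpy as np

# Hypothetical per-modality likelihoods over three material categories
# (say, glass / plastic / metal); values are illustrative only.
p_visual = np.array([0.7, 0.2, 0.1])
p_audio = np.array([0.1, 0.6, 0.3])

# Multiplicative rule for category likelihoods: after normalization, only
# categories consistent with BOTH modalities remain likely.
p_av = p_visual * p_audio
p_av /= p_av.sum()

# Weighted-average rule for a continuous property rating (e.g., hardness):
# weights proportional to each modality's reliability (inverse variance),
# as in standard optimal Bayesian cue combination.
hardness_v, var_v = 0.8, 0.04  # assumed visual estimate and variance
hardness_a, var_a = 0.4, 0.16  # assumed auditory estimate and variance
w_v = (1 / var_v) / (1 / var_v + 1 / var_a)
hardness_av = w_v * hardness_v + (1 - w_v) * hardness_a

print(p_av)         # the middle category, favored by audition, now dominates
print(hardness_av)  # pulled toward the more reliable (visual) estimate
```

Note how the two rules behave differently: the product sharpens the category distribution toward mutually consistent options, while the weighted average produces a compromise estimate between the two modalities.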
Pages: 20
Related articles
50 in total
  • [1] Optimal audiovisual integration of object appearance and impact sounds in human perception of materials
    Fujisaki, Waka
    Goda, Naokazu
    Motoyoshi, Isamu
    Komatsu, Hidehiko
    Nishida, Shin'ya
    I-PERCEPTION, 2014, 5(4): 238-238
  • [2] Not glass but plastic - Audiovisual integration in human material perception
    Nishida, S.
    Fujisaki, W.
    Goda, N.
    Motoyoshi, I.
    Komatsu, H.
    PERCEPTION, 2012, 41: 49-49
  • [3] Audiovisual integration in perception of real words
    Dekle, D. J.
    Fowler, C. A.
    Funnell, M. G.
    PERCEPTION & PSYCHOPHYSICS, 1992, 51(4): 355-362
  • [4] Automatic audiovisual integration in speech perception
    Gentilucci, M.
    Cattaneo, L.
    EXPERIMENTAL BRAIN RESEARCH, 2005, 167(1): 66-75
  • [5] The Role of Audiovisual Integration in the Perception of Attractiveness
    Mook, Alexis T.
    Mitchel, Aaron D.
    EVOLUTIONARY BEHAVIORAL SCIENCES, 2019, 13(1): 1-15
  • [6] Dissociation of perception and action in audiovisual multisensory integration
    Leone, Lynnette M.
    McCourt, Mark E.
    EUROPEAN JOURNAL OF NEUROSCIENCE, 2015, 42(11): 2915-2922
  • [7] Reassessing the Benefits of Audiovisual Integration to Speech Perception and Intelligibility
    O'Hanlon, Brandon
    Plack, Christopher J.
    Nuttall, Helen E.
    JOURNAL OF SPEECH LANGUAGE AND HEARING RESEARCH, 2025, 68(1): 26-39
  • [8] Perception based method for the investigation of audiovisual integration of speech
    Huhn, Zsofia
    Szirtes, Gabor
    Lorincz, Andras
    Csepe, Valeria
    NEUROSCIENCE LETTERS, 2009, 465(3): 204-209
  • [9] Modeling the Development of Audiovisual Cue Integration in Speech Perception
    Getz, Laura M.
    Nordeen, Elke R.
    Vrabic, Sarah C.
    Toscano, Joseph C.
    BRAIN SCIENCES, 2017, 7(3)