Inside Speech: Multisensory and Modality-specific Processing of Tongue and Lip Speech Actions

Cited by: 6
Authors
Treille, Avril [1 ,2 ]
Vilain, Coriandre [1 ,2 ]
Hueber, Thomas [1 ,2 ]
Lamalle, Laurent [3 ,4 ,5 ]
Sato, Marc [6 ,7 ]
Affiliations
[1] CNRS, UMR 5216, Grenoble, France
[2] Grenoble Univ, Grenoble, France
[3] Univ Grenoble Alpes, Grenoble, France
[4] CHU Grenoble, Grenoble, France
[5] CNRS, UMS 3552, Grenoble, France
[6] CNRS, UMR 7309, Marseille, France
[7] Aix Marseille Univ, Marseille, France
Funding
European Research Council;
Keywords
PREMOTOR CORTEX; AUDIOVISUAL SPEECH; MIRROR NEURONS; ACTION RECOGNITION; AUDITORY-CORTEX; MOTOR SYSTEM; BROCAS AREA; PERCEPTION; FMRI; INTEGRATION;
DOI
10.1162/jocn_a_01057
CLC number
Q189 [Neuroscience];
Discipline code
071006;
Abstract
Action recognition has been found to rely not only on sensory brain areas but also partly on the observer's motor system. However, whether distinct auditory and visual experiences of an action modulate sensorimotor activity remains largely unknown. In the present sparse-sampling fMRI study, we determined to what extent sensory and motor representations interact during the perception of tongue and lip speech actions. Tongue and lip speech actions were selected because an interlocutor's tongue movements are accessible through their impact on speech acoustics but are not visible, given the tongue's position inside the vocal tract, whereas lip movements are both audible and visible. Participants were presented with auditory, visual, and audiovisual speech actions, with the visual inputs showing either a sagittal view of the speaker's tongue movements or a facial view of the speaker's lip movements, previously recorded with an ultrasound imaging system and a video camera, respectively. Although the neural networks involved in visuolingual and visuofacial perception largely overlapped, stronger motor and somatosensory activations were observed during visuolingual perception. In contrast, stronger activity was found in auditory and visual cortices during visuofacial perception. Complementing these findings, activity in the left premotor cortex and in visual brain areas correlated with visual recognition scores for visuolingual and visuofacial speech stimuli, respectively, whereas visual activity correlated with RTs for both types of stimuli. These results suggest that unimodal and multimodal processing of lip and tongue speech actions relies on common sensorimotor brain areas. They also suggest that visual processing of audible but not visible movements induces motor and visual mental simulation of the perceived actions, facilitating recognition and/or learning of the association between auditory and visual signals.
Pages: 448 - 466
Page count: 19
Related Papers
50 records
  • [1] ABSTRACT VERSUS MODALITY-SPECIFIC MEMORY REPRESENTATIONS IN PROCESSING AUDITORY AND VISUAL SPEECH
    DEGELDER, B
    VROOMEN, J
    MEMORY & COGNITION, 1992, 20 (05) : 533 - 538
  • [2] Multisensory and modality specific processing of visual speech in different regions of the premotor cortex
    Callan, Daniel E.
    Jones, Jeffery A.
    Callan, Akiko
    FRONTIERS IN PSYCHOLOGY, 2014, 5
  • [3] Subcortical, Modality-Specific Pathways Contribute to Multisensory Processing in Humans
    van den Brink, R. L.
    Cohen, M. X.
    van der Burg, E.
    Talsma, D.
    Vissers, M. E.
    Slagter, H. A.
    CEREBRAL CORTEX, 2014, 24 (08) : 2169 - 2177
  • [4] Perception of Incongruent Audiovisual Speech: Distribution of Modality-Specific Responses
    Sandhya
    Vinay
    Manchaiah, V
    AMERICAN JOURNAL OF AUDIOLOGY, 2021, 30 (04) : 968 - 979
  • [5] NON-MODALITY SPECIFIC SPEECH CODING - THE PROCESSING OF LIP-READ INFORMATION
    DODD, B
    CAMPBELL, R
    AUSTRALIAN JOURNAL OF PSYCHOLOGY, 1984, 36 (02) : 171 - 179
  • [6] Modality-specific selective attention attenuates multisensory integration
    Mozolic, Jennifer L.
    Hugenschmidt, Christina E.
    Peiffer, Ann M.
    Laurienti, Paul J.
    EXPERIMENTAL BRAIN RESEARCH, 2008, 184 (01) : 39 - 52
  • [7] Modality-specific and multisensory mechanisms of spatial attention and expectation
    Zuanazzi, Arianna
    Noppeney, Uta
    JOURNAL OF VISION, 2020, 20 (08)
  • [8] Multisensory and Modality-Specific Influences on Adaptation to Optical Prisms
    Calzolari, Elena
    Albini, Federica
    Bolognini, Nadia
    Vallar, Giuseppe
    FRONTIERS IN HUMAN NEUROSCIENCE, 2017, 11
  • [9] Suppression of multisensory integration by modality-specific attention in aging
    Hugenschmidt, Christina E.
    Mozolic, Jennifer L.
    Laurienti, Paul J.
    NEUROREPORT, 2009, 20 (04) : 349 - 353