Electrophysiological Indices of Audiovisual Speech Perception in the Broader Autism Phenotype

Cited by: 10
Authors
Irwin, Julia [1 ,2 ]
Avery, Trey [1 ]
Turcios, Jacqueline [1 ,3 ]
Brancazio, Lawrence [1 ,2 ]
Cook, Barbara [3 ]
Landi, Nicole [1 ,4 ]
Affiliations
[1] Haskins Labs Inc, New Haven, CT 06511 USA
[2] Southern Connecticut State Univ, Dept Psychol, New Haven, CT 06515 USA
[3] Southern Connecticut State Univ, Dept Commun Disorders, New Haven, CT 06515 USA
[4] Univ Connecticut, Psychol Sci, Storrs, CT 06269 USA
Keywords
audiovisual speech perception; development; broader autism phenotype; ERP; SPECTRUM DISORDER; CHILDREN; INFANTS; FACE;
DOI
10.3390/brainsci7060060
Chinese Library Classification (CLC)
Q189 [Neuroscience];
Subject Classification Code
071006;
Abstract
When a speaker talks, the consequences can be both heard (audio) and seen (visual). A novel visual phonemic restoration task was used to assess behavioral discrimination and neural signatures (event-related potentials, or ERPs) of audiovisual processing in typically developing children with a range of social and communicative skills, assessed using the Social Responsiveness Scale, a measure of traits associated with autism. An auditory oddball design presented two types of stimuli to the listener: a clear exemplar of an auditory consonant-vowel syllable /ba/ (the more frequently occurring standard stimulus), and a syllable in which the auditory cues for the consonant were substantially weakened, creating a stimulus more like /a/ (the infrequently presented deviant stimulus). All speech tokens were paired with a face producing /ba/ or a face with a pixelated mouth containing motion but no visual speech. In this paradigm, the visual /ba/ should cause the auditory /a/ to be perceived as /ba/, creating an attenuated oddball response; in contrast, a pixelated video (without articulatory information) should not have this effect. Behaviorally, participants showed visual phonemic restoration (reduced accuracy in detecting the deviant /a/) in the presence of a speaking face. In addition, ERPs were observed in both an early time window (N100) and a later time window (P300) that were sensitive to speech context (/ba/ or /a/) and modulated by face context (speaking face with visible articulation or with a pixelated mouth). Specifically, the oddball responses for the N100 and P300 were attenuated in the presence of a face producing /ba/ relative to a pixelated face, representing a possible neural correlate of the phonemic restoration effect. Notably, individuals with more traits associated with autism (yet still in the non-clinical range) had smaller P300 responses overall, regardless of face context, suggesting generally reduced phonemic discrimination.
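To make the analysis described in the abstract concrete, below is a minimal, hypothetical sketch (Python/NumPy, not the authors' analysis code) of how a deviant-minus-standard oddball response could be quantified in the N100 and P300 windows for each face context. The sampling rate, epoch layout, electrode choice, and window bounds are illustrative assumptions only.

```python
# Hypothetical sketch: quantify the oddball (deviant-minus-standard) ERP response
# in assumed N100 and P300 windows, separately for the two face contexts.
import numpy as np

FS = 500                                  # sampling rate in Hz (assumed)
TIMES = np.arange(-0.1, 0.8, 1 / FS)      # epoch time axis: -100 ms to 800 ms (assumed)

# Assumed analysis windows (seconds) for the two components of interest.
WINDOWS = {"N100": (0.080, 0.150), "P300": (0.300, 0.500)}


def mean_amplitude(epochs: np.ndarray, window: tuple) -> float:
    """Mean voltage of the trial-averaged ERP within a time window.

    `epochs` has shape (n_trials, n_times) for a single electrode,
    e.g. a central site such as Cz (assumed).
    """
    erp = epochs.mean(axis=0)                            # average across trials
    mask = (TIMES >= window[0]) & (TIMES <= window[1])   # samples inside the window
    return float(erp[mask].mean())


def oddball_response(standard: np.ndarray, deviant: np.ndarray, component: str) -> float:
    """Deviant-minus-standard difference in mean amplitude for one component."""
    win = WINDOWS[component]
    return mean_amplitude(deviant, win) - mean_amplitude(standard, win)


# Usage with simulated placeholder data (80 noise trials per condition).
rng = np.random.default_rng(0)
fake = lambda: rng.normal(0.0, 1e-6, size=(80, TIMES.size))

for context in ("speaking_face", "pixelated_mouth"):
    diff_p300 = oddball_response(standard=fake(), deviant=fake(), component="P300")
    print(f"{context}: P300 oddball response = {diff_p300 * 1e6:.2f} uV")
# Phonemic restoration would predict a smaller (attenuated) deviant-minus-standard
# difference in the speaking-face context than in the pixelated-mouth context.
```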
Pages: 13