Low-Dimensional Structure in the Space of Language Representations is Reflected in Brain Responses

Cited: 0
Authors
Antonello, Richard [1 ]
Turek, Javier [2 ]
Vy Vo [2 ]
Huth, Alexander [1 ]
Affiliations
[1] UT Austin, Austin, TX 78712 USA
[2] Intel Labs, Hillsboro, OR USA
Keywords
CORTICAL ORGANIZATION; HIERARCHY
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
How related are the representations learned by neural language models, translation models, and models trained on language tagging tasks? We answer this question by adapting an encoder-decoder transfer learning method from computer vision to investigate the structure among 100 different feature spaces extracted from hidden representations of various networks trained on language tasks. This method reveals a low-dimensional structure in which language models and translation models smoothly interpolate between word embeddings, syntactic and semantic tasks, and future word embeddings. We call this low-dimensional structure a language representation embedding because it encodes the relationships between the representations needed to process language for a variety of natural language processing (NLP) tasks. We find that this representation embedding can predict how well each individual feature space maps to human brain responses to natural language stimuli recorded using fMRI. Additionally, we find that the principal dimension of this structure can be used to create a metric that highlights the brain's natural language processing hierarchy. This suggests that the embedding captures some part of the brain's natural language representation structure.
Pages: 13
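The abstract compresses a multi-step pipeline, so a toy sketch may help make it concrete: measure how well each feature space linearly transfers to every other one, then embed the spaces in a low-dimensional representation embedding. The Python below is a minimal illustration under stated assumptions, not the authors' code: it uses synthetic Gaussian feature spaces (6 rather than the paper's 100), ridge regression as the transfer map, and classical MDS for the embedding; the helper `transfer_score` is hypothetical. The paper's further step of relating the embedding to fMRI encoding performance is omitted.

```python
# Illustrative sketch only: pairwise linear transfer between feature spaces,
# then a low-dimensional embedding of the spaces from the transfer matrix.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

n_tokens, n_spaces, dim = 2000, 6, 64
# Synthetic stand-ins for hidden representations of the same text under
# different models/tasks (the paper used ~100 spaces from trained networks).
shared = rng.standard_normal((n_tokens, dim))
spaces = [shared @ rng.standard_normal((dim, dim))
          + 0.5 * rng.standard_normal((n_tokens, dim))
          for _ in range(n_spaces)]

def transfer_score(src, tgt):
    """Held-out R^2 of a ridge map from one feature space to another."""
    Xtr, Xte, Ytr, Yte = train_test_split(src, tgt, random_state=0)
    return Ridge(alpha=1.0).fit(Xtr, Ytr).score(Xte, Yte)

# Pairwise transfer matrix: entry (i, j) says how well space i predicts j.
T = np.array([[transfer_score(spaces[i], spaces[j])
               for j in range(n_spaces)] for i in range(n_spaces)])

# Turn transfer performance into a dissimilarity and embed with classical
# MDS (eigendecomposition of the double-centered squared-distance matrix).
D = 1.0 - (T + T.T) / 2.0            # symmetrize; high transfer = nearby
J = np.eye(n_spaces) - 1.0 / n_spaces
B = -0.5 * J @ (D ** 2) @ J
vals, vecs = np.linalg.eigh(B)
order = np.argsort(vals)[::-1]
embedding = vecs[:, order[:2]] * np.sqrt(np.maximum(vals[order[:2]], 0.0))

# The first column plays the role of the abstract's "principal dimension",
# which the paper relates to the brain's language processing hierarchy.
print(embedding[:, 0])
```

Classical MDS is chosen here only because it turns a transfer-derived dissimilarity into coordinates in a few lines; any standard dimensionality reduction over the transfer matrix would serve the same illustrative purpose.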