Neural Language Models as Psycholinguistic Subjects: Representations of Syntactic State

Cited by: 0
Authors
Futrell, Richard [1]
Wilcox, Ethan [2]
Morita, Takashi [3,4]
Qian, Peng [5]
Ballesteros, Miguel [6]
Levy, Roger [5]
Affiliations
[1] Univ Calif Irvine, Dept Language Sci, Irvine, CA 92697 USA
[2] Harvard Univ, Dept Linguist, Cambridge, MA USA
[3] Kyoto Univ, Primate Res Inst, Kyoto, Japan
[4] MIT, Dept Linguist & Philosophy, Cambridge, MA USA
[5] MIT, Dept Brain & Cognit Sci, Cambridge, MA USA
[6] MIT, IBM Watson Lab, IBM Res, Cambridge, MA USA
Source
2019 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES (NAACL HLT 2019), VOL. 1, 2019
Keywords
PREDICTION;
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
We investigate the extent to which the behavior of neural network language models reflects incremental representations of syntactic state. To do so, we employ experimental methodologies which were originally developed in the field of psycholinguistics to study syntactic representation in the human mind. We examine neural network model behavior on sets of artificial sentences containing a variety of syntactically complex structures. These sentences not only test whether the networks have a representation of syntactic state, they also reveal the specific lexical cues that networks use to update these states. We test four models: two publicly available LSTM sequence models of English (Jozefowicz et al., 2016; Gulordava et al., 2018) trained on large datasets; an RNN Grammar (Dyer et al., 2016) trained on a small, parsed dataset; and an LSTM trained on the same small corpus as the RNNG. We find evidence for basic syntactic state representations in all models, but only the models trained on large datasets are sensitive to subtle lexical cues signalling changes in syntactic state.
Pages: 32-42
Page count: 11
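
The abstract above describes a targeted-evaluation paradigm: a language model is run incrementally over hand-constructed sentences, and its word-by-word probabilities (surprisals) are compared across minimally different conditions to infer whether it tracks syntactic state. The sketch below illustrates that general paradigm only; it is not the authors' released code. It assumes the Hugging Face transformers library and substitutes a pretrained GPT-2 model for the LSTM and RNNG models actually evaluated in the paper, and the garden-path sentences are illustrative stand-ins, not the paper's stimuli.

# Minimal sketch of surprisal-based targeted evaluation (illustrative only).
# Assumes the Hugging Face `transformers` library and GPT-2 as a stand-in
# for the LSTM / RNNG language models evaluated in the paper.
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

def token_surprisals(sentence):
    """Return (token, surprisal-in-bits) pairs; surprisal = -log2 P(w_i | w_<i)."""
    ids = tokenizer(sentence, return_tensors="pt").input_ids
    with torch.no_grad():
        logits = model(ids).logits
    log_probs = torch.log_softmax(logits, dim=-1)
    # Logits at position i predict the token at position i+1, so the first
    # token receives no surprisal estimate here.
    target_ids = ids[0, 1:].unsqueeze(-1)
    nats = -log_probs[0, :-1, :].gather(1, target_ids).squeeze(-1)
    bits = nats / torch.log(torch.tensor(2.0))
    tokens = tokenizer.convert_ids_to_tokens(ids[0])[1:]
    return list(zip(tokens, bits.tolist()))

# Illustrative garden-path contrast (hypothetical stimuli): a spike in surprisal
# at the disambiguating verb "fell" in the first sentence, relative to the
# unambiguous control, suggests the model committed to the wrong incremental parse.
for sentence in ["The horse raced past the barn fell.",
                 "The horse that was raced past the barn fell."]:
    print(sentence)
    for token, surprisal in token_surprisals(sentence):
        print(f"  {token!r:>12}  {surprisal:6.2f} bits")

In the paper's paradigm, such per-region surprisal differences across minimal-pair conditions are the behavioral measurements used to probe a model's representation of syntactic state.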