Transition-based Neural Constituent Parsing

Cited by: 0
Authors
Watanabe, Taro [1 ,2 ]
Sumita, Eiichiro
Affiliations
[1] Natl Inst Informat & Commun Technol, 3-5 Hikaridai,Seika Cho, Kyoto 6190289, Japan
[2] Google, Tokyo, Japan
Keywords
LANGUAGE; ONLINE
DOI: not available
CLC classification: TP18 [Artificial intelligence theory]
Subject classification codes: 081104; 0812; 0835; 1405
Abstract
Constituent parsing is typically modeled by a chart-based algorithm under probabilistic context-free grammars or by a transition-based algorithm with rich features. Previous models rely heavily on richer syntactic information obtained by lexicalizing rules, splitting categories, or memorizing long histories. However, such enriched models incur numerous parameters and sparsity issues, and are still insufficient for capturing various syntactic phenomena. We propose a neural network structure that explicitly models the unbounded history of actions performed on the stack and queue employed in transition-based parsing, in addition to representations of the partially parsed tree structure. Our transition-based neural constituent parser achieves performance comparable to state-of-the-art parsers, with F1 scores of 90.68% for English and 84.33% for Chinese, without reranking, feature templates, or additional data to train model parameters.
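The stack-and-queue machinery the abstract refers to can be illustrated with a minimal shift-reduce transition system. This is a hedged sketch, not the authors' neural model: the action names (`SHIFT`, `REDUCE-k-X`) and the fixed action sequence below are illustrative assumptions showing only how such actions build a constituent tree.

```python
# Minimal sketch of a shift-reduce transition system for constituent
# parsing. Assumption: actions are given; a real parser (such as the
# one in this paper) would score and choose them with a neural network.

class Node:
    """A constituent-tree node: either a tagged leaf word or a labeled phrase."""
    def __init__(self, label, children=None, word=None):
        self.label = label
        self.children = children or []
        self.word = word

    def __repr__(self):
        if self.word is not None:
            return f"({self.label} {self.word})"
        return "(" + self.label + " " + " ".join(map(repr, self.children)) + ")"

def parse(tagged_words, actions):
    """Replay a fixed action sequence over a stack and an input queue.

    SHIFT moves the next (tag, word) pair from the queue onto the stack
    as a leaf; REDUCE-k-X pops k subtrees off the stack and combines
    them into a new constituent labeled X.
    """
    queue = list(tagged_words)
    stack = []
    for action in actions:
        if action == "SHIFT":
            tag, word = queue.pop(0)
            stack.append(Node(tag, word=word))
        else:  # e.g. "REDUCE-2-NP"
            _, k, label = action.split("-")
            k = int(k)
            children = stack[-k:]
            del stack[-k:]
            stack.append(Node(label, children=children))
    assert not queue and len(stack) == 1, "actions must consume all input"
    return stack[0]

# Build (S (NP (DT the) (NN cat)) (VP (VBZ sleeps)))
tree = parse(
    [("DT", "the"), ("NN", "cat"), ("VBZ", "sleeps")],
    ["SHIFT", "SHIFT", "REDUCE-2-NP", "SHIFT", "REDUCE-1-VP", "REDUCE-2-S"],
)
print(tree)  # → (S (NP (DT the) (NN cat)) (VP (VBZ sleeps)))
```

The paper's contribution is a neural network that conditions each action choice on the unbounded history of prior actions and on representations of the partial trees on the stack, rather than on hand-crafted feature templates.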
Pages: 1169-1179 (11 pages)
Related papers (50 total)
  • [1] Improving Feature-Rich Transition-Based Constituent Parsing Using Recurrent Neural Networks
    Ma, Chunpeng
    Tamura, Akihiro
    Liu, Lemao
    Zhao, Tiejun
    Sumita, Eiichiro
    IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS, 2017, E100D (09) : 2205 - 2214
  • [2] Structured Training for Neural Network Transition-Based Parsing
    Weiss, David
    Alberti, Chris
    Collins, Michael
    Petrov, Slav
    PROCEEDINGS OF THE 53RD ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS AND THE 7TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING, VOL 1, 2015, : 323 - 333
  • [3] Structured prediction for transition-based constituent parsing: dense models and hollow models
    Coavoux, Maximin
    Crabbe, Benoit
    TRAITEMENT AUTOMATIQUE DES LANGUES, 2016, 57 (01): : 59 - 83
  • [4] A Neural Transition-Based Approach for Semantic Dependency Graph Parsing
    Wang, Yuxuan
    Che, Wanxiang
    Guo, Jiang
    Liu, Ting
    THIRTY-SECOND AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTIETH INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / EIGHTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2018, : 5561 - 5568
  • [5] Bidirectional Transition-Based Dependency Parsing
    Yuan, Yunzhe
    Jiang, Yong
    Tu, Kewei
    THIRTY-THIRD AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FIRST INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / NINTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2019, : 7434 - 7441
  • [6] Minimalist Grammar Transition-Based Parsing
    Stanojevic, Milos
    LOGICAL ASPECTS OF COMPUTATIONAL LINGUISTICS: CELEBRATING 20 YEARS OF LACL (1996-2016), 2016, 10054 : 273 - 290
  • [7] Joint POS Tagging and Dependence Parsing With Transition-Based Neural Networks
    Yang, Liner
    Zhang, Meishan
    Liu, Yang
    Sun, Maosong
    Yu, Nan
    Fu, Guohong
    IEEE-ACM TRANSACTIONS ON AUDIO SPEECH AND LANGUAGE PROCESSING, 2018, 26 (08) : 1352 - 1358
  • [8] Joint POS Tagging and Transition-based Constituent Parsing in Chinese with Non-local Features
    Wang, Zhiguo
    Xue, Nianwen
    PROCEEDINGS OF THE 52ND ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, VOL 1, 2014, : 733 - 742
  • [9] Integrating graph embedding and neural models for improving transition-based dependency parsing
    Le-Hong, Phuong
    Cambria, Erik
    NEURAL COMPUTING & APPLICATIONS, 2024, 36 (06): : 2999 - 3016
  • [10] A Neural Probabilistic Structured-Prediction Model for Transition-Based Dependency Parsing
    Zhou, Hao
    Zhang, Yue
    Huang, Shujian
    Chen, Jiajun
    PROCEEDINGS OF THE 53RD ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS AND THE 7TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING, VOL 1, 2015, : 1213 - 1222