Self-attentive Biaffine Dependency Parsing

Cited: 0
Authors
Li, Ying [1 ]
Li, Zhenghua [1 ]
Zhang, Min [1 ]
Wang, Rui [2 ]
Li, Sheng [1 ]
Si, Luo [2 ]
Affiliations
[1] Soochow Univ, Sch Comp Sci & Technol, Inst Artificial Intelligence, Suzhou, Peoples R China
[2] Alibaba Grp, Shenzhen, Peoples R China
Funding
National Natural Science Foundation of China
Keywords: (none listed)
DOI: (not available)
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Current state-of-the-art dependency parsing approaches employ BiLSTMs to encode input sentences. Motivated by the success of transformer-based machine translation, this work applies the self-attention mechanism to dependency parsing for the first time as a replacement for the BiLSTM, achieving competitive performance on both English and Chinese benchmark data. Based on detailed error analysis, we then combine the strengths of BiLSTM and self-attention via model ensembles, demonstrating their complementary ability to capture contextual information. Finally, we explore recently proposed contextualized word representations as extra input features, which further improve parsing performance.
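The abstract's core idea — replacing the BiLSTM encoder with self-attention while keeping a biaffine arc scorer — can be sketched in a few lines of numpy. This is a minimal single-head illustration, not the paper's implementation: the function names, weight shapes, and the bias-free biaffine form are assumptions made for clarity.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention over a sentence.

    X: (n, d_in) word representations; Wq/Wk/Wv: (d_in, d_att) projections.
    Returns the contextualized vectors and the attention weight matrix.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(Q.shape[-1])          # (n, n) pairwise scores
    scores -= scores.max(axis=-1, keepdims=True)     # stabilize the softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # each row sums to 1
    return weights @ V, weights

def biaffine_arc_scores(H, W_dep, W_head, U):
    """Biaffine scoring of head-dependent pairs (bias terms omitted).

    H: (n, d) encoder output; W_dep/W_head: (d, m) role projections;
    U: (m, m) biaffine weight. scores[i, j] = score of word j heading word i.
    """
    Hd = H @ W_dep      # dependent-role representations
    Hh = H @ W_head     # head-role representations
    return Hd @ U @ Hh.T
```

In a full parser the score matrix would feed an argmax or maximum-spanning-tree decoder to pick each word's head; multi-head attention, positional encodings, layer stacking, and label scoring are all omitted here.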
Pages: 5067-5073
Page count: 7
Related papers (50 in total)
  • [21] SALADNET: SELF-ATTENTIVE MULTISOURCE LOCALIZATION IN THE AMBISONICS DOMAIN
    Grumiaux, Pierre-Amaury
    Kitic, Srdan
    Srivastava, Prerak
    Girin, Laurent
    Guerin, Alexandre
    2021 IEEE WORKSHOP ON APPLICATIONS OF SIGNAL PROCESSING TO AUDIO AND ACOUSTICS (WASPAA), 2021, : 336 - 340
  • [22] Self-Attentive Moving Average for Time Series Prediction
    Su, Yaxi
    Cui, Chaoran
    Qu, Hao
    APPLIED SCIENCES-BASEL, 2022, 12 (07):
  • [23] Graph convolutional network and self-attentive for sequential recommendation
    Guo, Kaifeng
    Zeng, Guolei
    PEERJ COMPUTER SCIENCE, 2023, 9
  • [24] Deep Fourier Kernel for Self-Attentive Point Processes
    Zhu, Shixiang
    Zhang, Minghe
    Ding, Ruyi
    Xie, Yao
    24TH INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS (AISTATS), 2021, 130
  • [25] Self-attentive Rationalization for Interpretable Graph Contrastive Learning
    Li, Sihang
    Luo, Yanchen
    Zhang, An
    Wang, Xiang
    Li, Longfei
    Zhou, Jun
    Chua, Tat-seng
    ACM TRANSACTIONS ON KNOWLEDGE DISCOVERY FROM DATA, 2025, 19 (02)
  • [26] Self-Attentive Similarity Measurement Strategies in Speaker Diarization
    Lin, Qingjian
    Hou, Yu
    Li, Ming
    INTERSPEECH 2020, 2020, : 284 - 288
  • [27] Explicit Sparse Self-Attentive Network for CTR Prediction
    Luo, Yu
    Peng, Wanwan
    Fan, Youping
    Pang, Hong
    Xu, Xiang
    Wu, Xiaohua
    PROCEEDINGS OF THE 10TH INTERNATIONAL CONFERENCE OF INFORMATION AND COMMUNICATION TECHNOLOGY, 2021, 183 : 690 - 695
  • [28] Improving Disfluency Detection by Self-Training a Self-Attentive Model
    Lou, Paria Jamshid
    Johnson, Mark
    58TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2020), 2020, : 3754 - 3763
  • [29] A self-attentive model for tracing knowledge and engagement in parallel
    Jiang, Hua
    Xiao, Bing
    Luo, Yintao
    Ma, Junliang
    PATTERN RECOGNITION LETTERS, 2023, 165 : 25 - 32
  • [30] Locker: Locally Constrained Self-Attentive Sequential Recommendation
    He, Zhankui
    Zhao, Handong
    Wang, Zhaowen
    Lin, Zhe
    Kale, Ajinkya
    McAuley, Julian
    PROCEEDINGS OF THE 30TH ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT, CIKM 2021, 2021, : 3088 - 3092