Transition-Based Mention Representation for Neural Coreference Resolution

Cited: 0
Authors
Li, Qingqing [1 ,2 ]
Kong, Fang [1 ,2 ]
Affiliations
[1] Soochow University, Laboratory of Natural Language Processing, Suzhou, China
[2] Soochow University, School of Computer Science & Technology, Suzhou, China
Keywords
Transition-based Approach; Neural Coreference Resolution; Nested Noun Phrases;
DOI
10.1007/978-981-99-4752-2_46
CLC Number
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Coreference resolution plays an important role in text understanding. Recently, various neural approaches have been proposed and have achieved success. Although most researchers agree that mention extraction and representation strongly affect the performance of coreference resolution, existing neural architectures consider all possible spans up to a maximum length and employ only a simple flat word-level span representation. As a result, information proven effective in earlier non-neural coreference resolution, such as structural information, has been largely ignored. In this paper, we propose a unified transition-based approach that extracts and represents mentions simultaneously for coreference resolution. In particular, we propose a Simplified Constituent Parse Tree (SCPT) scheme for each sentence, keeping only the local detail inside mentions and the coarse frame structure outside them, so that each mention corresponds to a constituent of the resulting SCPT. We then employ a transition-based strategy to construct the SCPT structure in a bottom-up manner. In this way, various potential mentions (i.e., constituents) can be obtained, and the corresponding transition action sequences, which embed internal structural information, serve as their representations. Finally, we use these potential mentions and their transition-based representations for neural coreference resolution.
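The record does not spell out the paper's transition inventory, but the bottom-up construction the abstract describes can be pictured with a small shift-reduce oracle. The Python sketch below is an illustrative assumption rather than the paper's actual scheme: the SHIFT/REDUCE-k action set, the oracle_actions function, and the toy example are all invented for exposition. Each REDUCE closes one constituent, so the action subsequence that builds a mention's constituent is the kind of structure-bearing signal the paper proposes to use as that mention's representation.

```python
# Minimal sketch, assuming a SHIFT / REDUCE-k action set: derive a bottom-up
# action sequence whose REDUCE steps close exactly the given constituent
# spans, in the spirit of the paper's SCPT construction. All names here are
# hypothetical; the paper's exact transition scheme may differ.

def oracle_actions(tokens, spans):
    """Return a SHIFT/REDUCE-k action sequence whose REDUCE steps close
    exactly the given (start, end) spans (end inclusive). Spans are assumed
    nested or disjoint, as constituents of a parse tree must be."""
    # Group spans by their end token; at a shared end, close inner
    # (shorter) constituents before outer ones.
    by_end = {}
    for s, e in sorted(spans, key=lambda se: (se[1], se[1] - se[0])):
        by_end.setdefault(e, []).append(s)

    actions = []
    stack = []  # (start, end) spans of already-built subtrees
    for i, _tok in enumerate(tokens):
        actions.append("SHIFT")
        stack.append((i, i))
        # After shifting token i, reduce every constituent ending here:
        # merge the top k stack items into one new node per constituent.
        for start in by_end.get(i, []):
            k = 0
            while stack and stack[-1][0] >= start:
                k += 1
                stack.pop()
            stack.append((start, i))
            actions.append(f"REDUCE-{k}")
    return actions

# Toy usage: two nested mentions, "the company" inside
# "the CEO of the company".
tokens = ["the", "CEO", "of", "the", "company", "resigned"]
mentions = [(3, 4), (0, 4)]
print(oracle_actions(tokens, mentions))
```

On the toy sentence the oracle yields five SHIFTs, then REDUCE-2 and REDUCE-4, then a final SHIFT: the inner mention "the company" is closed first and the outer mention second, mirroring the bottom-up order in which the abstract says potential mentions (constituents) are obtained.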
Pages: 563-574
Page count: 12
Related Papers
50 records in total
  • [1] Coreference Resolution with Mention Representation Using a Convolutional Neural Network
    Jeong, Seok-Won
    Kim, Sihyung
    Kim, Harksoo
    ADVANCED SCIENCE LETTERS, 2017, 23 (10) : 9534 - 9537
  • [2] A Neural Transition-based Model for Nested Mention Recognition
    Wang, Bailin
    Lu, Wei
    Wang, Yu
    Jin, Hongxia
    2018 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2018), 2018, : 1011 - 1017
  • [3] Coreference Resolution through a seq2seq Transition-Based System
    Bohnet, Bernd
    Alberti, Chris
    Collins, Michael
    TRANSACTIONS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, 2023, 11 : 212 - 226
  • [4] Neural Coreference Resolution with Deep Biaffine Attention by Joint Mention Detection and Mention Clustering
    Zhang, Rui
    dos Santos, Cicero Nogueira
    Yasunaga, Michihiro
    Xiang, Bing
    Radev, Dragomir R.
    PROCEEDINGS OF THE 56TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, VOL 2, 2018, : 102 - 107
  • [5] Mention detection in coreference resolution: survey
    Lata, Kusum
    Singh, Pardeep
    Dutta, Kamlesh
    APPLIED INTELLIGENCE, 2022, 52 (09) : 9816 - 9860
  • [6] Mention detection in Turkish coreference resolution
    Demir, Seniz
    Akdag, Hanifi Ibrahim
    TURKISH JOURNAL OF ELECTRICAL ENGINEERING AND COMPUTER SCIENCES, 2024, 32 (05) : 682 - 697
  • [7] Using Mention Accessibility to Improve Coreference Resolution
    Webster, Kellie
    Nothman, Joel
    PROCEEDINGS OF THE 54TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2016), VOL 2, 2016, : 432 - 437
  • [8] Mention Detection Using Pointer Networks for Coreference Resolution
    Park, Cheoneum
    Lee, Changki
    Lim, Soojong
    ETRI JOURNAL, 2017, 39 (05) : 652 - 661
  • [9] Mention Clustering to Improve Portuguese Semantic Coreference Resolution
    Fonseca, Evandro
    Vanin, Aline
    Vieira, Renata
    NATURAL LANGUAGE PROCESSING AND INFORMATION SYSTEMS (NLDB 2018), 2018, 10859 : 256 - 263