A Hierarchical Encoding-Decoding Scheme for Abstractive Multi-document Summarization

Cited by: 0
Authors
Shen, Chenhui [1 ,2 ]
Cheng, Liying [1 ,3 ]
Xuan-Phi Nguyen [1 ,3 ]
You, Yang [2 ]
Bing, Lidong [1 ,3 ]
Affiliations
[1] Alibaba Grp, DAMO Acad, Singapore, Singapore
[2] Natl Univ Singapore, Singapore, Singapore
[3] Hupan Lab, Hangzhou 310023, Peoples R China
Keywords
(none listed)
DOI
(not available)
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Pre-trained language models (PLMs) have achieved outstanding results in abstractive single-document summarization (SDS). However, such benefits may not fully extend to multi-document summarization (MDS), where the handling of cross-document information is more complex. Previous works either design new MDS architectures or apply PLMs bluntly, concatenating the source documents to reformulate MDS as an SDS task. While the former does not leverage prior pre-training efforts and may not generalize well across different domains, the latter may not sufficiently attend to the intricate cross-document relationships unique to MDS tasks. Instead, we enforce hierarchy on both the encoder and the decoder to better utilize a PLM and facilitate multi-document interactions for the MDS task. Across 10 MDS benchmarks from various domains, our method outperforms or is competitive with the previous best models, including those with additional MDS pre-training or with more parameters. It outperforms its corresponding PLM backbone by up to 3 ROUGE-L and is favored by humans.
Pages: 5872-5887
Page count: 16
Related Papers
50 in total
  • [1] Abstractive Multi-Document Summarization
    Ranjitha, N. S.
    Kallimani, Jagadish S.
    2017 INTERNATIONAL CONFERENCE ON ADVANCES IN COMPUTING, COMMUNICATIONS AND INFORMATICS (ICACCI), 2017, : 1690 - 1693
  • [2] Disentangling Specificity for Abstractive Multi-document Summarization
    Ma, Congbo
    Institute of Electrical and Electronics Engineers Inc.
  • [3] Document-aware Positional Encoding and Linguistic-guided Encoding for Abstractive Multi-document Summarization
    2022 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2022,
  • [4] Compressed Heterogeneous Graph for Abstractive Multi-Document Summarization
    Li, Miao
    Qi, Jianzhong
    Lau, Jey Han
    THIRTY-SEVENTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 37 NO 11, 2023, : 13085 - 13093
  • [5] Entity-Aware Abstractive Multi-Document Summarization
    Zhou, Hao
    Ren, Weidong
    Liu, Gongshen
    Su, Bo
    Lu, Wei
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL-IJCNLP 2021, 2021, : 351 - 362
  • [6] Topic-Guided Abstractive Multi-Document Summarization
    Cui, Peng
    Hu, Le
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EMNLP 2021, 2021, : 1463 - 1472
  • [7] Abstractive Multi-Document Summarization via Joint Learning with Single-Document Summarization
    Jin, Hanqi
    Wan, Xiaojun
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EMNLP 2020, 2020, : 2545 - 2554
  • [8] Hierarchical Transformers for Multi-Document Summarization
    Liu, Yang
    Lapata, Mirella
    57TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2019), 2019, : 5070 - 5081
  • [9] Hierarchical Summarization: Scaling Up Multi-Document Summarization
    Christensen, Janara
    Soderland, Stephen
    Bansal, Gagan
    Mausam
    PROCEEDINGS OF THE 52ND ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, VOL 1, 2014, : 902 - 912
  • [10] Abstractive Multi-document Summarization Using Deep Learning Approaches
    Poornima, Murkute
    Pulipati, Venkateswara Rao
    Kumar, T. Sunil
    PROCEEDINGS OF SECOND INTERNATIONAL CONFERENCE ON ADVANCES IN COMPUTER ENGINEERING AND COMMUNICATION SYSTEMS, ICACECS 2021, 2022, : 57 - 68