A Hierarchical Encoding-Decoding Scheme for Abstractive Multi-document Summarization

Times cited: 0
Authors
Shen, Chenhui [1,2]
Cheng, Liying [1,3]
Nguyen, Xuan-Phi [1,3]
You, Yang [2]
Bing, Lidong [1,3]
Affiliations
[1] Alibaba Group, DAMO Academy, Singapore
[2] National University of Singapore, Singapore
[3] Hupan Lab, Hangzhou 310023, China
DOI: not available
CLC number: TP18 (Theory of Artificial Intelligence)
Discipline codes: 081104; 0812; 0835; 1405
Abstract
Pre-trained language models (PLMs) have achieved impressive results in abstractive single-document summarization (SDS). However, such benefits may not fully extend to multi-document summarization (MDS), where handling cross-document information is more complex. Previous works either design new MDS architectures or apply PLMs directly to the concatenated source documents, reformulating MDS as an SDS task. The former does not reuse prior pre-training effort and may not generalize well across domains, while the latter may not sufficiently attend to the intricate cross-document relationships unique to MDS. Instead, we enforce hierarchy on both the encoder and the decoder so that a PLM can better model multi-document interactions for the MDS task. Across 10 MDS benchmarks from various domains, our method outperforms or is competitive with the previous best models, including those with additional MDS pre-training or more parameters. It outperforms its corresponding PLM backbone by up to 3 ROUGE-L points and is preferred by human evaluators.
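The abstract does not spell out the hierarchy mechanism. One common way to impose document-level hierarchy on a PLM encoder is to mark each source document with a document-level token and restrict ordinary tokens to within-document attention while the document-level tokens attend globally. The sketch below (PyTorch, with hypothetical names such as hierarchical_encoder_mask and <doc-sep>) only illustrates that general idea under these assumptions; it is not the authors' implementation.

```python
# Illustrative sketch only: assumes a <doc-sep>-token hierarchy with local
# (within-document) attention plus global document-level nodes. Not taken
# from the paper.
import torch

def hierarchical_encoder_mask(doc_lengths):
    """Build a (total_len x total_len) boolean self-attention mask.

    doc_lengths: token count per source document; position 0 of each
    document is treated as its <doc-sep> (document-level) token.
    """
    total_len = sum(doc_lengths)
    mask = torch.zeros(total_len, total_len, dtype=torch.bool)

    starts = []
    offset = 0
    for length in doc_lengths:
        starts.append(offset)
        # Local attention: tokens of one document attend to each other.
        mask[offset:offset + length, offset:offset + length] = True
        offset += length

    sep_positions = torch.tensor(starts)
    # Global attention: every token can read from and write to all <doc-sep>
    # tokens, so cross-document information flows through document-level nodes.
    mask[:, sep_positions] = True
    mask[sep_positions, :] = True
    return mask

if __name__ == "__main__":
    # Three documents of 4, 3, and 5 tokens after tokenization (hypothetical).
    print(hierarchical_encoder_mask([4, 3, 5]).int())
```

Such a mask would be supplied to the encoder's self-attention; the decoder side could analogously attend to document-level states before token-level states, which is one plausible reading of "hierarchy on both the encoder and decoder".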
Pages: 5872-5887
Page count: 16