Applying Siamese Hierarchical Attention Neural Networks for multi-document summarization

Cited by: 0
Authors
Angel Gonzalez, Jose [1]
Delonca, Julien [1]
Sanchis, Emilio [1]
Garcia-Granada, Fernando [1]
Segarra, Encarna [1]
Affiliations
[1] Univ Politecn Valencia, VRAIN Valencian Res Inst Artificial Intelligence, Camino de Vera S-N, E-46022 Valencia, Spain
Source
Keywords
Siamese Hierarchical Attention Neural Networks; multi-document summarization;
DOI
10.26342/2019-63-12
CLC classification number
TP18 [Theory of Artificial Intelligence];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In this paper, we present an approach to multi-document summarization based on Siamese Hierarchical Attention Neural Networks. The attention mechanism of Hierarchical Attention Networks assigns a score to each sentence according to its relevance in the classification process. For the summarization process, only these sentence scores are used to rank the sentences and select the most salient ones. In this work we explore the adaptability of this model to the problem of multi-document summarization, which typically involves very long documents on which the straightforward application of neural networks tends to fail. The experiments were carried out using CNN/DailyMail as the training corpus and DUC-2007 as the test corpus. Despite the differences between the characteristics of the training set (CNN/DailyMail) and the test set (DUC-2007), the results show the adequacy of this approach for multi-document summarization.
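The abstract describes an extractive pipeline in which the sentence-level attention weights of a hierarchical attention model are reused as salience scores. The sketch below is a minimal illustration of that selection step only, not the authors' implementation: it assumes a trained hierarchical attention encoder already provides one embedding per sentence and a learned attention context vector (both replaced here by random stand-ins), computes softmax attention scores, and greedily keeps the highest-scoring sentences under a word budget.

# Minimal sketch of attention-score-based extractive selection (illustration only,
# not the authors' code). Sentence embeddings and the attention context vector would
# come from a trained Siamese Hierarchical Attention Network; here they are random
# stand-in vectors.
import numpy as np

def attention_scores(sentence_vecs: np.ndarray, context: np.ndarray) -> np.ndarray:
    # Softmax-normalised relevance of each sentence w.r.t. the context vector.
    logits = sentence_vecs @ context          # shape: (num_sentences,)
    logits -= logits.max()                    # numerical stability
    weights = np.exp(logits)
    return weights / weights.sum()

def extractive_summary(sentences, sentence_vecs, context, word_budget=100):
    # Rank sentences by attention score, greedily select within the word budget,
    # then restore the original document order.
    scores = attention_scores(sentence_vecs, context)
    order = np.argsort(-scores)               # most salient first
    chosen, used = [], 0
    for idx in order:
        length = len(sentences[idx].split())
        if used + length > word_budget:
            continue
        chosen.append(idx)
        used += length
    return [sentences[i] for i in sorted(chosen)]

if __name__ == "__main__":
    sents = [
        "The first sentence reports the main event of the document cluster.",
        "A second sentence adds supporting background detail.",
        "A third sentence repeats information already covered above.",
    ]
    rng = np.random.default_rng(0)
    vecs = rng.normal(size=(len(sents), 16))  # stand-in sentence embeddings
    ctx = rng.normal(size=16)                 # stand-in attention context vector
    print(extractive_summary(sents, vecs, ctx, word_budget=20))

In the multi-document setting considered in the paper, the same ranking would be applied to the sentences of the whole document cluster rather than a single article; only the scores change, not the selection logic.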
Pages: 111 - 118
Number of pages: 8
Related papers
50 records in total
  • [1] Hierarchical Transformers for Multi-Document Summarization
    Liu, Yang
    Lapata, Mirella
    57TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2019), 2019, : 5070 - 5081
  • [2] Boosting multi-document summarization with hierarchical graph convolutional networks
    Song, Yingjie
    Yang, Li
    Luo, Wenming
    Xiao, Xiong
    Tang, Zhuo
    NEUROCOMPUTING, 2025, 614
  • [3] Hierarchical Summarization: Scaling Up Multi-Document Summarization
    Christensen, Janara
    Soderland, Stephen
    Bansal, Gagan
    Mausam
    PROCEEDINGS OF THE 52ND ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, VOL 1, 2014, : 902 - 912
  • [4] A Hybrid Hierarchical Model for Multi-Document Summarization
    Celikyilmaz, Asli
    Hakkani-Tur, Dilek
    ACL 2010: 48TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, 2010, : 815 - 824
  • [5] Siamese hierarchical attention networks for extractive summarization
    Gonzalez, Jose-Angel
    Segarra, Encarna
    Garcia-Granada, Fernando
    Sanchis, Emilio
    Hurtado, Lluis-F
    JOURNAL OF INTELLIGENT & FUZZY SYSTEMS, 2019, 36 (05) : 4599 - 4607
  • [6] Ranking with Recursive Neural Networks and Its Application to Multi-Document Summarization
    Cao, Ziqiang
    Wei, Furu
    Dong, Li
    Li, Sujian
    Zhou, Ming
    PROCEEDINGS OF THE TWENTY-NINTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2015, : 2153 - 2159
  • [7] A hierarchical framework for multi-document summarization of dissertation abstracts
    Khoo, CSG
    Ou, SY
    Goh, DHL
    DIGITAL LIBRARIES: PEOPLE, KNOWLEDGE, AND TECHNOLOGY, PROCEEDINGS, 2002, 2555 : 99 - 110
  • [8] Multi-document summarization with determinantal point process attention
    Perez-Beltrachini, Laura
    Lapata, Mirella
    JOURNAL OF ARTIFICIAL INTELLIGENCE RESEARCH, 2021, 71: 371 - 399
  • [9] Weighted hierarchical archetypal analysis for multi-document summarization
    Canhasi, Ercan
    Kononenko, Igor
    COMPUTER SPEECH AND LANGUAGE, 2016, 37 : 24 - 46
  • [10] Siamese Network-Based Prioritization for Enhanced Multi-document Summarization
    Garcia, Klaifer
    Berton, Lilian
    INTELLIGENT SYSTEMS, BRACIS 2024, PT II, 2025, 15413 : 400 - 415