A Combined Extractive With Abstractive Model for Summarization

Cited by: 9
Authors
Liu, Wenfeng [1 ]
Gao, Yaling [1 ]
Li, Jinming [1 ]
Yang, Yuzhen [1 ]
Affiliation
[1] Heze Univ, Sch Comp, Heze 274015, Peoples R China
Keywords
Syntactics; Feature extraction; Semantics; Reinforcement learning; Neural networks; Licenses; Deep learning; Extractive summarization; abstractive summarization; beam search; word embeddings
DOI
10.1109/ACCESS.2021.3066484
CLC number
TP [Automation technology, computer technology]
Subject classification number
0812
Abstract
Aiming at the difficulties of document-level summarization, this paper presents a two-stage, extractive-then-abstractive summarization model. In the first stage, we extract the important sentences by combining a sentence similarity matrix (used only in the first round) or a pseudo-title with sentence features (such as sentence position and paragraph position) to obtain coarse-grained sentences from the document, while accounting for the differentiation among the most important sentences. The second stage is abstractive: we use a beam search algorithm to restructure and rewrite the syntactic blocks of the extracted sentences. The newly generated summary sentence serves as the pseudo-summary for the next round, and the globally optimal pseudo-title is taken as the final summary. Extensive experiments on the corresponding data set show that our model obtains better results.
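The abstract's second stage relies on beam search to pick the best rewriting of extracted syntactic blocks. As a minimal, generic sketch of that idea (the slot candidates, scores, and function name here are illustrative assumptions, not the paper's actual implementation):

```python
def beam_search(candidates_per_slot, beam_width=2):
    """Generic beam search: fill each slot in turn, keeping only the
    `beam_width` highest-scoring partial sequences at every step.

    candidates_per_slot: list of slots, each a list of (option, score)
    pairs; returns the best (sequence, total_score) found.
    """
    beams = [([], 0.0)]  # one empty hypothesis to start
    for options in candidates_per_slot:
        expanded = []
        for seq, total in beams:
            for opt, score in options:
                expanded.append((seq + [opt], total + score))
        # prune: keep only the top `beam_width` partial sequences
        expanded.sort(key=lambda beam: beam[1], reverse=True)
        beams = expanded[:beam_width]
    return beams[0]

# Hypothetical example: two slots, each with scored word candidates.
slots = [[("the", 0.6), ("a", 0.4)], [("cat", 0.7), ("dog", 0.3)]]
best_seq, best_score = beam_search(slots, beam_width=2)
# best_seq is ["the", "cat"] with total score 1.3
```

A wider beam trades speed for a better chance of reaching the globally optimal sequence, which is the role the pseudo-title selection plays across rounds in the model described above.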
Pages: 43970-43980
Number of pages: 11