An Abstractive Summarization Model Based on Joint-Attention Mechanism and a Priori Knowledge

Cited: 0
Authors
Li, Yuanyuan [1 ]
Huang, Yuan [1 ]
Huang, Weijian [1 ]
Yu, Junhao [1 ]
Huang, Zheng [1 ]
Affiliations
[1] Hebei Univ Engn, Sch Informat & Elect Engn, Handan 056038, Peoples R China
Source
APPLIED SCIENCES-BASEL | 2023, Vol. 13, Issue 7
Keywords
abstractive summarization; joint-attention mechanism; prior knowledge; reinforcement learning;
DOI
10.3390/app13074610
CLC Number
O6 [Chemistry];
Subject Classification Code
0703;
Abstract
An abstractive summarization model based on a joint-attention mechanism and a priori knowledge is proposed to address two problems in abstractive summarization models: inadequate semantic understanding of the text, and generated summaries that do not conform to human language habits. First, the word vectors most relevant to the original text are selected. Second, the original text is represented at two levels, the word level and the sentence level, as word vectors and sentence vectors, respectively. After this processing, relationships exist not only among the word-level vectors but also among the sentence-level vectors, and the decoder weighs the word-level and sentence-level vectors according to their relationship with its hidden state. Then, the pointer-generator network is improved using a priori knowledge. Finally, reinforcement learning is used to improve the quality of the generated summaries. Experiments on two classical datasets, CNN/DailyMail and DUC 2004, show that the model performs well and effectively improves the quality of the generated summaries.
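The joint-attention step described in the abstract can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the dot-product scoring, the sigmoid gate, and all names (`joint_attention`, `dec_state`, `word_vecs`, `sent_vecs`) are assumptions made for the example; the paper's exact scoring and fusion functions may differ.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def joint_attention(dec_state, word_vecs, sent_vecs):
    """Mix word-level and sentence-level contexts for one decoder step.

    dec_state: (d,) decoder hidden state
    word_vecs: (n_words, d) word-level encodings
    sent_vecs: (n_sents, d) sentence-level encodings
    """
    # Attention weights from dot-product scores at each level
    # (illustrative choice of scoring function).
    word_attn = softmax(word_vecs @ dec_state)
    sent_attn = softmax(sent_vecs @ dec_state)
    # Level-specific context vectors.
    word_ctx = word_attn @ word_vecs
    sent_ctx = sent_attn @ sent_vecs
    # Gate deciding how much to rely on each level, driven by the
    # decoder state's affinity with the two contexts.
    g = 1.0 / (1.0 + np.exp(-(dec_state @ (word_ctx - sent_ctx))))
    return g * word_ctx + (1.0 - g) * sent_ctx
```

With a zero decoder state, both attention distributions are uniform and the gate is 0.5, so the result is the average of the mean word vector and the mean sentence vector.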
Pages: 19