An Abstractive Summarization Model Based on Joint-Attention Mechanism and a Priori Knowledge

Cited: 0
Authors
Li, Yuanyuan [1 ]
Huang, Yuan [1 ]
Huang, Weijian [1 ]
Yu, Junhao [1 ]
Huang, Zheng [1 ]
Affiliations
[1] Hebei Univ Engn, Sch Informat & Elect Engn, Handan 056038, Peoples R China
Source
APPLIED SCIENCES-BASEL | 2023, Vol. 13, No. 07
Keywords
abstractive summarization; joint-attention mechanism; prior knowledge; reinforcement learning;
DOI
10.3390/app13074610
CLC Number
O6 [Chemistry];
Discipline Code
0703;
Abstract
An abstractive summarization model based on a joint-attention mechanism and a priori knowledge is proposed to address two problems of existing abstractive summarization models: inadequate semantic understanding of the source text and generated summaries that do not conform to human language habits. First, the word vectors most relevant to the original text are selected. Second, the original text is represented at two granularities, the word level and the sentence level, as word vectors and sentence vectors, respectively; relationships are thus modeled both among word-level vectors and among sentence-level vectors, and the decoder weighs the word-level and sentence-level representations according to their relationship with its hidden state. Then, the pointer-generator network is improved using a priori knowledge. Finally, reinforcement learning is applied to improve the quality of the generated summaries. Experiments on two classical datasets, CNN/DailyMail and DUC 2004, show that the model performs well and effectively improves the quality of the generated summaries.
Pages: 19
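
The joint-attention step described in the abstract can be sketched as follows: the decoder attends separately over the word-level and sentence-level encodings, then mixes the two context vectors with a gate conditioned on its hidden state. Below is a minimal PyTorch sketch of that idea; the module name, the bilinear attention scoring, and the gating layer are illustrative assumptions, not the authors' implementation.

    # Illustrative sketch of a joint word-/sentence-level attention step,
    # assuming bilinear attention scores and a sigmoid mixing gate.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class JointAttentionDecoderStep(nn.Module):
        def __init__(self, hidden_dim: int):
            super().__init__()
            self.word_attn = nn.Linear(hidden_dim, hidden_dim, bias=False)
            self.sent_attn = nn.Linear(hidden_dim, hidden_dim, bias=False)
            # Gate deciding how much to rely on word- vs. sentence-level context.
            self.level_gate = nn.Linear(3 * hidden_dim, 1)

        def forward(self, dec_hidden, word_enc, sent_enc):
            # dec_hidden: (batch, hidden); word_enc: (batch, n_words, hidden);
            # sent_enc: (batch, n_sents, hidden).
            w_scores = torch.bmm(word_enc, self.word_attn(dec_hidden).unsqueeze(2)).squeeze(2)
            s_scores = torch.bmm(sent_enc, self.sent_attn(dec_hidden).unsqueeze(2)).squeeze(2)
            w_ctx = torch.bmm(F.softmax(w_scores, dim=1).unsqueeze(1), word_enc).squeeze(1)
            s_ctx = torch.bmm(F.softmax(s_scores, dim=1).unsqueeze(1), sent_enc).squeeze(1)
            # The decoder state arbitrates between the two granularities.
            g = torch.sigmoid(self.level_gate(torch.cat([dec_hidden, w_ctx, s_ctx], dim=1)))
            return g * w_ctx + (1 - g) * s_ctx

The sigmoid gate lets the decoder lean on word-level context when reproducing specific tokens and on sentence-level context when selecting broader content, matching the discrimination between granularities described in the abstract.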
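
The abstract does not specify the reinforcement-learning objective. A common choice in this literature is self-critical sequence training, where a sampled summary is rewarded against a greedily decoded baseline using a summary metric such as ROUGE; the sketch below assumes that setup, with the rewards computed externally and passed in as tensors.

    # Generic self-critical loss sketch; not confirmed as the paper's objective.
    import torch

    def self_critical_loss(sample_log_probs: torch.Tensor,
                           sample_reward: torch.Tensor,
                           greedy_reward: torch.Tensor) -> torch.Tensor:
        # sample_log_probs: (batch,) summed log-probabilities of the sampled summary.
        # sample_reward / greedy_reward: (batch,) sequence-level rewards,
        # e.g. ROUGE-L of the sampled vs. greedily decoded summary.
        advantage = sample_reward - greedy_reward  # greedy baseline reduces variance
        # Minimizing this raises the likelihood of samples that beat the baseline.
        return -(advantage.detach() * sample_log_probs).mean()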