An Abstractive Summarization Model Based on Joint-Attention Mechanism and a Priori Knowledge

Cited by: 0
Authors
Li, Yuanyuan [1 ]
Huang, Yuan [1 ]
Huang, Weijian [1 ]
Yu, Junhao [1 ]
Huang, Zheng [1 ]
Affiliations
[1] Hebei Univ Engn, Sch Informat & Elect Engn, Handan 056038, Peoples R China
Source
APPLIED SCIENCES-BASEL | 2023, Vol. 13, Issue 7
Keywords
abstractive summarization; joint-attention mechanism; prior knowledge; reinforcement learning;
DOI
10.3390/app13074610
Chinese Library Classification (CLC)
O6 [Chemistry];
Discipline Classification Code
0703;
Abstract
An abstractive summarization model based on a joint-attention mechanism and a priori knowledge is proposed to address two shortcomings of existing abstractive summarization models: inadequate semantic understanding of the text and generated summaries that do not conform to human language habits. First, the word vectors most relevant to the original text are selected. Second, the original text is represented at two granularities, the word level and the sentence level, as word vectors and sentence vectors, respectively; relationships are thereby captured not only among word-level vectors but also among sentence-level vectors, and the decoder weighs the word-level and sentence-level representations according to their relationship with its hidden state. Then, the pointer-generator network is improved using a priori knowledge. Finally, reinforcement learning is applied to further improve the quality of the generated summaries. Experiments on two classical datasets, CNN/DailyMail and DUC 2004, show that the model performs well and effectively improves the quality of the generated summaries.
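The joint-attention step described in the abstract can be illustrated with a short sketch. The following is a minimal, hypothetical PyTorch implementation, not the authors' code: the decoder scores word-level and sentence-level encodings against its hidden state, forms a context vector at each granularity, and mixes the two with a learned gate. All module names, dimensions, and the sigmoid-gate formulation are illustrative assumptions.

```python
# Minimal sketch of one decoder step with joint word-/sentence-level attention.
# Hypothetical design, not taken from the paper: the gate that mixes the two
# context vectors is one plausible way to let the decoder "discriminate"
# between granularities based on its hidden state.
import torch
import torch.nn as nn
import torch.nn.functional as F

class JointAttentionDecoderStep(nn.Module):
    def __init__(self, hidden_dim: int):
        super().__init__()
        # Bilinear scoring of encoder states against the decoder hidden state.
        self.word_attn = nn.Linear(hidden_dim, hidden_dim, bias=False)
        self.sent_attn = nn.Linear(hidden_dim, hidden_dim, bias=False)
        # Scalar gate deciding how much to trust each granularity.
        self.level_gate = nn.Linear(3 * hidden_dim, 1)

    def forward(self, dec_hidden, word_enc, sent_enc):
        # dec_hidden: (batch, hidden)
        # word_enc:   (batch, n_words, hidden)
        # sent_enc:   (batch, n_sents, hidden)
        w_scores = torch.bmm(self.word_attn(word_enc), dec_hidden.unsqueeze(2)).squeeze(2)
        s_scores = torch.bmm(self.sent_attn(sent_enc), dec_hidden.unsqueeze(2)).squeeze(2)
        # Attention-weighted context vector at each granularity.
        w_ctx = torch.bmm(F.softmax(w_scores, dim=1).unsqueeze(1), word_enc).squeeze(1)
        s_ctx = torch.bmm(F.softmax(s_scores, dim=1).unsqueeze(1), sent_enc).squeeze(1)
        # Gate in [0, 1]: relative weight given to the word-level context.
        g = torch.sigmoid(self.level_gate(torch.cat([dec_hidden, w_ctx, s_ctx], dim=1)))
        return g * w_ctx + (1.0 - g) * s_ctx

# Example: batch of 2, 6 words, 3 sentences, hidden size 8.
step = JointAttentionDecoderStep(8)
ctx = step(torch.randn(2, 8), torch.randn(2, 6, 8), torch.randn(2, 3, 8))
print(ctx.shape)  # torch.Size([2, 8])
```

In a full model, the returned joint context vector would feed the output projection and, in a pointer-generator setting, the copy/generate switch; those pieces are omitted here for brevity.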
Pages: 19
Related Papers
50 records in total
  • [1] Abstractive Document Summarization via Neural Model with Joint Attention
    Hou, Liwei
    Hu, Po
    Bei, Chao
    NATURAL LANGUAGE PROCESSING AND CHINESE COMPUTING, NLPCC 2017, 2018, 10619 : 329 - 338
  • [2] Contrastive Attention Mechanism for Abstractive Sentence Summarization
    Duan, Xiangyu
    Yu, Hongfei
    Yin, Mingming
    Zhang, Min
    Luo, Weihua
    Zhang, Yue
    2019 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING AND THE 9TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (EMNLP-IJCNLP 2019): PROCEEDINGS OF THE CONFERENCE, 2019, : 3044 - 3053
  • [3] Attention based Abstractive Summarization of Malayalam Document
    Nambiar, Sindhya K.
    Peter, David S.
    Idicula, Sumam Mary
    AI IN COMPUTATIONAL LINGUISTICS, 2021, 189 : 250 - 257
  • [4] Diversity driven attention model for query-based abstractive summarization
    Nema, Preksha
    Khapra, Mitesh M.
    Laha, Anirban
    Ravindran, Balaraman
    PROCEEDINGS OF THE 55TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2017), VOL 1, 2017, : 1063 - 1072
  • [5] Abstractive Text Summarization Using Enhanced Attention Model
    Roul, Rajendra Kumar
    Joshi, Pratik Madhav
    Sahoo, Jajati Keshari
    INTELLIGENT HUMAN COMPUTER INTERACTION (IHCI 2019), 2020, 11886 : 63 - 76
  • [6] Attention History-based Attention for Abstractive Text Summarization
    Lee, Hyunsoo
    Choi, YunSeok
    Lee, Jee-Hyong
    PROCEEDINGS OF THE 35TH ANNUAL ACM SYMPOSIUM ON APPLIED COMPUTING (SAC'20), 2020, : 1075 - 1081
  • [7] Abstractive text summarization model combining a hierarchical attention mechanism and multiobjective reinforcement learning
    Sun, Yujia
    Platos, Jan
    EXPERT SYSTEMS WITH APPLICATIONS, 2024, 248
  • [8] An abstractive text summarization technique using transformer model with self-attention mechanism
    Kumar, Sandeep
    Solanki, Arun
    NEURAL COMPUTING & APPLICATIONS, 2023, 35 (25): 18603 - 18622
  • [9] Self-Attention Guided Copy Mechanism for Abstractive Summarization
    Xu, Song
    Li, Haoran
    Yuan, Peng
    Wu, Youzheng
    He, Xiaodong
    Zhou, Bowen
    58TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2020), 2020, : 1355 - 1362