A Knowledge Graph Summarization Model Integrating Attention Alignment and Momentum Distillation

Cited by: 0
|
Authors
Wang, Zhao [1 ]
Zhao, Xia [1 ]
Affiliations
[1] Hebei Univ Econ & Business, Sch Management Sci & Informat Engn, 47 Xuefu Rd, Shijiazhuang 050061, Hebei, Peoples R China
Keywords
text summarization; knowledge graph; momentum distillation; attention mechanism alignment
DOI
10.20965/jaciii.2025.p0205
Chinese Library Classification
TP18 [Theory of Artificial Intelligence]
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Knowledge graph summarization models improve summary performance by combining text features with entity features. However, such models still have two shortcomings: the knowledge graph data introduce noise that deviates from the original text semantics, and the text and knowledge graph entity features cannot be fully integrated. To address these issues, a knowledge graph summarization model integrating attention alignment and momentum distillation (KGS-AAMD) is proposed. Pseudo-targets generated by the momentum distillation model serve as additional supervision signals during training to overcome data noise. The attention-based alignment method aligns text and entity features, laying the foundation for their subsequent full integration. Experimental results on two public datasets, CNN/Daily Mail and XSum, show that KGS-AAMD surpasses multiple baseline models and ChatGPT in the quality of summary generation, exhibiting significant performance advantages.
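The momentum distillation idea summarized above (a momentum-updated teacher whose soft outputs act as pseudo-targets alongside the gold labels) can be sketched roughly as follows. This is a minimal illustrative sketch, not the paper's actual implementation; the function names, the EMA coefficient `m`, and the mixing weight `alpha` are all assumptions:

```python
import numpy as np

def ema_update(teacher, student, m=0.995):
    """Momentum (EMA) update: teacher params drift slowly toward the student."""
    return {k: m * teacher[k] + (1.0 - m) * student[k] for k in teacher}

def softmax(z):
    """Numerically stable softmax over the last axis."""
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, hard_targets, alpha=0.4):
    """Mix cross-entropy on gold tokens with KL divergence toward the
    teacher's pseudo-target distribution (the extra supervision signal)."""
    p_s = softmax(student_logits)
    p_t = softmax(teacher_logits)  # pseudo-targets: soft labels from the teacher
    ce = -np.log(p_s[np.arange(len(hard_targets)), hard_targets] + 1e-12).mean()
    kl = (p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12))).sum(axis=-1).mean()
    return (1.0 - alpha) * ce + alpha * kl
```

Because the teacher is an exponential moving average of past student weights, its pseudo-targets are smoother than any single noisy gold label, which is what lets them counteract noise introduced by the knowledge graph data.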
Pages: 205 - 214
Page count: 10
Related Papers
50 records
  • [1] Research on Deep Knowledge Tracing Model Integrating Graph Attention Network
    Zhao, Zhongyuan
    Liu, Zhaohui
    Wang, Bei
    Ouyang, Lijun
    Wang, Can
    Ouyang, Yan
    2022 PROGNOSTICS AND HEALTH MANAGEMENT CONFERENCE, PHM-LONDON 2022, 2022, : 389 - 394
  • [2] Semantic Representation and Attention Alignment for Graph Information Bottleneck in Video Summarization
    Zhong, Rui
    Wang, Rui
    Yao, Wenjin
    Hu, Min
    Dong, Shi
    Munteanu, Adrian
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2023, 32 : 4170 - 4184
  • [3] Entity Alignment of Knowledge Graph by Joint Graph Attention and Translation Representation
    Jiang, Shixian
    Nie, Tiezheng
    Shen, Derong
    Kou, Yue
    Yu, Ge
    WEB INFORMATION SYSTEMS AND APPLICATIONS (WISA 2021), 2021, 12999 : 347 - 358
  • [4] A Multi-Role Graph Attention Network for Knowledge Graph Alignment
    Ding, Linyi
    Yuan, Weijie
    Meng, Kui
    Liu, Gongshen
    2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021,
  • [5] Knowledge Distillation on Extractive Summarization
    Lin, Ying-Jia
    Tan, Daniel
    Chou, Tzu-Hsuan
    Kao, Hung-Yu
    Wang, Hsin-Yang
    2020 IEEE THIRD INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND KNOWLEDGE ENGINEERING (AIKE 2020), 2020, : 71 - 76
  • [6] KTAT: A Complex Embedding Model of Knowledge Graph Integrating Type Information and Attention Mechanism
    Liu, Ying
    Wang, Peng
    Yang, Di
    APPLIED SCIENCES-BASEL, 2023, 13 (13):
  • [7] An Entity Alignment Model for Echinococcosis Knowledge Graph
    Gao, Yuan
    Zhang, Lejun
    Xu, Fei
    Ishdorj, Tseren-Onolt
    Su, YanSen
    ADVANCED INTELLIGENT COMPUTING TECHNOLOGY AND APPLICATIONS, PT VI, ICIC 2024, 2024, 14880 : 62 - 74
  • [8] Item Recommendation Algorithm Integrating Knowledge Graph and Attention Mechanism
    Xing, Junye
    Xing, Xing
    Jia, Zhichun
    Wang, Hongda
    Liu, Jiawen
    Computer Engineering and Applications, 60 (10): 173 - 179
  • [9] Attention Temperature Matters in Abstractive Summarization Distillation
    Zhang, Shengqiang
    Zhang, Xingxing
    Bao, Hangbo
    Wei, Furu
    PROCEEDINGS OF THE 60TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2022), VOL 1: (LONG PAPERS), 2022, : 127 - 141
  • [10] Incorporating External Knowledge into Unsupervised Graph Model for Document Summarization
    Tang, Tiancheng
    Yuan, Tianyi
    Tang, Xinhuai
    Chen, Delai
    ELECTRONICS, 2020, 9 (09) : 1 - 13