IMCGNN: Information Maximization based Continual Graph Neural Networks for inductive node classification

Cited by: 1
Authors
Yuan, Qiao [1 ,2 ]
Guan, Sheng-Uei [2 ]
Luo, Tianlun [1 ,2 ]
Man, Ka Lok [2 ]
Lim, Eng Gee [2 ]
Affiliations
[1] Univ Liverpool, Liverpool L69 3BX, England
[2] Xian Jiaotong Liverpool Univ, Suzhou 215123, Peoples R China
Keywords
Continual graph learning; Experience replay; Deep learning; Hippocampus
DOI
10.1016/j.neucom.2025.129362
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Continual graph learning (CGL) is an emerging topic that enables models to incrementally acquire new knowledge while retaining prior experience. It efficiently adapts models to evolving, dynamic graphs, avoiding the computational burden of retraining from scratch. The key distinction between CGL and conventional continual learning is that samples in graph-structured data are interdependent, whereas conventional settings assume they are independent. Consequently, CGL techniques should emphasize consolidating and leveraging the topological information in graph-structured data. Current methods address this need inadequately. Some approaches ignore topological information altogether, resulting in significant information loss; others attempt to preserve all learned information, leading to overly conservative models. Moreover, most of these methods employ graph neural networks (GNNs) as the base model yet fail to fully utilize the topological information that GNNs learn. Additionally, the majority of existing work focuses on the transductive setting, while inductive continual graph learning remains scarcely explored. Our proposed Information Maximization based Continual Graph Neural Network (IMCGNN) targets inductive task-incremental node classification. The proposed method comprises a replay module and a regularization module. The former extracts representative subgraphs from previous data and trains them jointly with new data to retain historical experience, whereas the latter preserves topological and loss-related information encoded in the network by imposing elastic penalties on its parameters. Unlike heuristic node selection, our approach uses information theory to guide the selection of the nodes forming a subgraph, aiming to better preserve information. Comparative experiments against nine baselines, using two graph learning models on five benchmark datasets, demonstrate the effectiveness and efficiency of our method.
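The abstract describes the two modules only at a high level. As an illustrative aid, the following minimal Python/PyTorch sketch shows one plausible way to combine information-guided replay selection with an elastic parameter penalty. It is not the paper's implementation: the entropy-based scoring is only an assumed surrogate for the information-maximization criterion, and the function names, importance weights, and hyperparameters (budget, lam, replay_weight) are hypothetical.

    # Illustrative sketch only (not from the paper): entropy-scored replay selection
    # plus an EWC-style elastic penalty, standing in for the information-maximization
    # and regularization modules described in the abstract.
    import torch
    import torch.nn.functional as F

    def select_replay_nodes(logits: torch.Tensor, budget: int) -> torch.Tensor:
        # Score each node by predictive entropy (an assumed information-theoretic
        # surrogate) and keep the `budget` highest-scoring node indices.
        probs = F.softmax(logits, dim=-1)
        entropy = -(probs * probs.clamp_min(1e-12).log()).sum(dim=-1)
        return torch.topk(entropy, k=min(budget, entropy.numel())).indices

    def elastic_penalty(model, old_params, importance):
        # Quadratic penalty discouraging drift on parameters deemed important for
        # previously learned (topological and loss-related) knowledge.
        loss = 0.0
        for name, p in model.named_parameters():
            if name in old_params:
                loss = loss + (importance[name] * (p - old_params[name]) ** 2).sum()
        return loss

    def continual_step(model, new_batch, replay_batch, old_params, importance,
                       lam=1.0, replay_weight=1.0):
        # One training step: new-task loss + replayed-subgraph loss + elastic penalty.
        # Graph structure is omitted for brevity; `model` is assumed to map a batch
        # (node features plus any adjacency it needs) to per-node logits.
        x_new, y_new = new_batch
        x_rep, y_rep = replay_batch
        loss = F.cross_entropy(model(x_new), y_new)
        loss = loss + replay_weight * F.cross_entropy(model(x_rep), y_rep)
        loss = loss + lam * elastic_penalty(model, old_params, importance)
        return loss

In the actual method, the node-selection objective and the parameter-importance weights would come from the information-maximization criterion and the topology-aware regularizer described in the abstract, rather than from the entropy and generic importance dictionary assumed here.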
Pages: 11