Contrastive and Generative Graph Convolutional Networks for Graph-based Semi-Supervised Learning

Times Cited: 0
Authors
Wan, Sheng [1 ,2 ]
Pan, Shirui [3 ]
Yang, Jian [1 ,2 ]
Gong, Chen [1 ,2 ,4 ]
Affiliations
[1] Nanjing Univ Sci & Technol, PCA Lab, Key Lab Intelligent Percept & Syst High Dimens In, Minist Educ, Nanjing, Peoples R China
[2] Nanjing Univ Sci & Technol, Sch Comp Sci & Engn, Jiangsu Key Lab Image & Video Understanding Socia, Nanjing, Peoples R China
[3] Monash Univ, Fac IT, Clayton, Vic, Australia
[4] Hong Kong Polytech Univ, Dept Comp, Hong Kong, Peoples R China
Keywords
CLASSIFICATION;
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Graph-based Semi-Supervised Learning (SSL) aims to transfer the labels of a handful of labeled data to the remaining massive unlabeled data via a graph. As one of the most popular graph-based SSL approaches, the recently proposed Graph Convolutional Networks (GCNs) have made remarkable progress by combining the expressive power of neural networks with graph structure. Nevertheless, existing graph-based methods do not directly address the core problem of SSL, i.e., the shortage of supervision, and thus their performance is still limited. To address this issue, a novel GCN-based SSL algorithm is presented in this paper that enriches the supervision signals by utilizing both data similarities and graph structure. First, by designing a semi-supervised contrastive loss, improved node representations can be generated by maximizing the agreement between different views of the same data or between data from the same class. As a result, the abundant unlabeled data and the scarce yet valuable labeled data jointly provide rich supervision for learning discriminative node representations, which helps improve the subsequent classification results. Second, the underlying determinative relationship between the data features and the input graph topology is extracted as a supplementary supervision signal via a graph generative loss related to the input features. Extensive experiments on a variety of real-world datasets firmly verify the effectiveness of our algorithm compared with other state-of-the-art methods.
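To illustrate the kind of objective the abstract describes, the sketch below implements a generic semi-supervised contrastive loss in NumPy: each node's anchor in one view treats the same node in the other view as a positive and, when the node is labeled, also treats same-class nodes as positives, with all remaining cross-view pairs acting as negatives. This is a minimal, hedged reconstruction of the general idea, not the paper's exact formulation; the function name, the `-1` unlabeled marker, and the temperature value are illustrative assumptions.

```python
import numpy as np

def semi_supervised_contrastive_loss(z1, z2, labels, tau=0.5):
    """Illustrative semi-supervised contrastive loss (not the paper's exact
    formulation). z1, z2: (n, d) node embeddings from two views.
    labels: (n,) class indices, with -1 marking an unlabeled node."""
    # L2-normalize so the dot product is a cosine similarity.
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = np.exp(z1 @ z2.T / tau)  # (n, n) temperature-scaled similarities
    n = z1.shape[0]
    loss = 0.0
    for i in range(n):
        pos = {i}                  # the same node in the other view
        if labels[i] >= 0:         # labeled: add same-class nodes as positives
            pos |= {j for j in range(n) if labels[j] == labels[i]}
        denom = sim[i].sum()       # all cross-view pairs in the denominator
        loss += -np.mean([np.log(sim[i, j] / denom) for j in sorted(pos)])
    return loss / n
```

In this sketch the unlabeled nodes still contribute supervision through the view-to-view positive pair, while the labeled nodes additionally pull same-class representations together, matching the abstract's point that labeled and unlabeled data jointly enrich the supervision signal.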
Pages: 10049-10057
Number of Pages: 9