Self-supervised Graph-level Representation Learning with Adversarial Contrastive Learning

Cited by: 41
|
Authors
Luo, Xiao [1 ]
Ju, Wei [2 ]
Gu, Yiyang [2 ]
Mao, Zhengyang [2 ]
Liu, Luchen [2 ]
Yuan, Yuhui [3 ]
Zhang, Ming [2 ]
Affiliations
[1] Univ Calif Los Angeles, Dept Comp Sci, Los Angeles, CA 90095 USA
[2] Peking Univ, Sch Comp Sci, Beijing 100871, Peoples R China
[3] Microsoft Res Asia, Beijing, Peoples R China
Funding
China Postdoctoral Science Foundation; National Natural Science Foundation of China;
Keywords
Graph representation learning; graph neural networks; contrastive learning; self-supervised learning; PREDICTION;
DOI
10.1145/3624018
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Discipline Code
0812 ;
Abstract
The recently developed unsupervised graph representation learning approaches apply contrastive learning to graph-structured data and achieve promising performance. However, these methods mainly focus on graph augmentation for positive samples, while negative mining strategies for graph contrastive learning remain less explored, leading to sub-optimal performance. To tackle this issue, we propose a Graph Adversarial Contrastive Learning (GraphACL) scheme that learns a bank of negative samples for effective self-supervised whole-graph representation learning. GraphACL consists of (i) a graph encoding branch that generates the representations of positive samples and (ii) an adversarial generation branch that produces a bank of negative samples. To generate more powerful hard negative samples, our method minimizes the contrastive loss when updating the encoder while adversarially maximizing the contrastive loss over the negative samples, thereby providing a challenging contrastive task. Moreover, the quality of the representations produced by the adversarial generation branch is enhanced through the regularization of carefully designed bank divergence and bank orthogonality losses. We optimize the parameters of the graph encoding branch and the adversarial generation branch alternately. Extensive experiments on 14 real-world benchmarks covering both graph classification and transfer learning tasks demonstrate the effectiveness of the proposed approach over existing graph self-supervised representation learning methods.
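The alternating min-max scheme described in the abstract can be sketched as follows. This is a simplified illustration, not the authors' implementation: a linear layer stands in for the GNN readout, random vectors stand in for pooled graph features and their augmented view, and the temperature `tau`, bank size `K`, and optimizer settings are illustrative assumptions (the bank divergence and orthogonality regularizers are omitted).

```python
import torch
import torch.nn.functional as F

def contrastive_loss(z, z_pos, bank, tau=0.5):
    """InfoNCE-style loss: positives come from an augmented view,
    negatives from a learnable bank of K vectors."""
    z = F.normalize(z, dim=1)          # (B, d) anchor graph embeddings
    z_pos = F.normalize(z_pos, dim=1)  # (B, d) augmented-view embeddings
    neg = F.normalize(bank, dim=1)     # (K, d) adversarially learned negatives
    pos_sim = (z * z_pos).sum(dim=1) / tau   # (B,)  similarity to the positive
    neg_sim = z @ neg.t() / tau              # (B, K) similarity to each negative
    logits = torch.cat([pos_sim.unsqueeze(1), neg_sim], dim=1)
    # The positive always sits at index 0 of the logits.
    return F.cross_entropy(logits, torch.zeros(len(z), dtype=torch.long))

torch.manual_seed(0)
B, K, d = 8, 16, 32
encoder = torch.nn.Linear(d, d)             # stand-in for a GNN encoder + readout
bank = torch.randn(K, d, requires_grad=True)
opt_enc = torch.optim.Adam(encoder.parameters(), lr=1e-2)
opt_bank = torch.optim.Adam([bank], lr=1e-2)

x = torch.randn(B, d)                       # stand-in for pooled graph features
x_aug = x + 0.1 * torch.randn(B, d)         # stand-in for an augmented view
for _ in range(50):
    # (i) encoder update: minimize the contrastive loss (bank frozen)
    loss = contrastive_loss(encoder(x), encoder(x_aug), bank.detach())
    opt_enc.zero_grad(); loss.backward(); opt_enc.step()
    # (ii) adversarial bank update: maximize the same loss (encoder frozen),
    # i.e. gradient ascent via minimizing the negated loss
    adv = -contrastive_loss(encoder(x).detach(), encoder(x_aug).detach(), bank)
    opt_bank.zero_grad(); adv.backward(); opt_bank.step()
```

The key design point is the alternation: the encoder sees the bank only as fixed negatives (`bank.detach()`), while the bank's ascent step sees the encoder outputs as fixed, so each branch optimizes the shared contrastive objective in the opposite direction.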
Pages: 23
Related Papers
50 records
  • [31] Contrastive Self-supervised Representation Learning Using Synthetic Data
    Dong-Yu She
    Kun Xu
    International Journal of Automation and Computing, 2021, 18 (04) : 556 - 567
  • [33] Self-supervised Segment Contrastive Learning for Medical Document Representation
    Abro, Waheed Ahmed
    Kteich, Hanane
    Bouraoui, Zied
    ARTIFICIAL INTELLIGENCE IN MEDICINE, PT I, AIME 2024, 2024, 14844 : 312 - 321
  • [34] Self-supervised Contrastive Graph Views for Learning Neuron-Level Circuit Network
    Li, Junchi
    Wan, Guojia
    Liao, Minghui
    Liao, Fei
    Du, Bo
    MEDICAL IMAGE COMPUTING AND COMPUTER ASSISTED INTERVENTION - MICCAI 2024, PT XI, 2024, 15011 : 590 - 600
  • [35] Self-supervised Consensus Representation Learning for Attributed Graph
    Liu, Changshu
    Wen, Liangjian
    Kang, Zhao
    Luo, Guangchun
    Tian, Ling
    PROCEEDINGS OF THE 29TH ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA, MM 2021, 2021, : 2654 - 2662
  • [36] Mixing up contrastive learning: Self-supervised representation learning for time series
    Wickstrom, Kristoffer
    Kampffmeyer, Michael
    Mikalsen, Karl Oyvind
    Jenssen, Robert
    PATTERN RECOGNITION LETTERS, 2022, 155 : 54 - 61
  • [37] Self-supervised Graph Representation Learning with Variational Inference
    Liao, Zihan
    Liang, Wenxin
    Liu, Han
    Mu, Jie
    Zhang, Xianchao
    ADVANCES IN KNOWLEDGE DISCOVERY AND DATA MINING, PAKDD 2021, PT III, 2021, 12714 : 116 - 127
  • [38] TimesURL: Self-Supervised Contrastive Learning for Universal Time Series Representation Learning
    Liu, Jiexi
    Chen, Songcan
    THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 12, 2024, : 13918 - 13926
  • [39] Self-supervised graph representation learning via bootstrapping
    Che, Feihu
    Yang, Guohua
    Zhang, Dawei
    Tao, Jianhua
    Liu, Tong
NEUROCOMPUTING, 2021, 456 : 88 - 96
  • [40] Simple Self-supervised Multiplex Graph Representation Learning
    Mo, Yujie
    Chen, Yuhuan
    Peng, Liang
    Shi, Xiaoshuang
    Zhu, Xiaofeng
    PROCEEDINGS OF THE 30TH ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA, MM 2022, 2022, : 3301 - 3309