FairSample: Training Fair and Accurate Graph Convolutional Neural Networks Efficiently

Citations: 0
Authors
Cong, Zicun [1 ]
Shi, Baoxu [2 ]
Li, Shan [2 ]
Yang, Jaewon [2 ]
He, Qi [2 ]
Pei, Jian [1 ]
Affiliations
[1] Simon Fraser Univ, Sch Comp Sci, Burnaby, BC V5A 1S6, Canada
[2] LinkedIn Corp, Sunnyvale, CA 94085 USA
Keywords
Computational modeling; Task analysis; Training; Social networking (online); Predictive models; Costs; Neural networks; Graph neural network; sampling; fairness
DOI
10.1109/TKDE.2023.3306378
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Fairness in Graph Convolutional Neural Networks (GCNs) is an increasingly important concern as GCNs are adopted in many crucial applications. Societal biases against sensitive groups may exist in many real-world graphs, and GCNs trained on such graphs may inherit and amplify those biases. In this paper, we adopt the well-known fairness notion of demographic parity and tackle the challenge of training fair and accurate GCNs efficiently. We present an in-depth analysis of how graph structure bias, node attribute bias, and model parameters may affect the demographic parity of GCNs. Our insights lead to FairSample, a framework that jointly mitigates all three types of bias. We employ two intuitive strategies to rectify graph structures. First, we inject edges across nodes that belong to different sensitive groups but have similar node features. Second, to enhance model fairness while retaining model quality, we develop a learnable neighbor sampling policy using reinforcement learning. To address the bias in node features and model parameters, FairSample is complemented by a regularization objective that optimizes fairness.
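As a minimal illustration of the demographic parity notion the abstract refers to (this is not FairSample's implementation, just a standard way to measure the metric), the demographic parity gap for binary predictions and a binary sensitive attribute is the absolute difference in positive-prediction rates between the two sensitive groups:

```python
def demographic_parity_gap(y_hat, s):
    """Absolute difference in positive-prediction rates between
    the two sensitive groups (s in {0, 1}, y_hat in {0, 1})."""
    rate = {}
    for group in (0, 1):
        preds = [y for y, g in zip(y_hat, s) if g == group]
        rate[group] = sum(preds) / len(preds)
    return abs(rate[0] - rate[1])

# Group 0 receives positives at rate 2/3, group 1 at rate 1/3,
# so the demographic parity gap is 1/3.
gap = demographic_parity_gap([1, 1, 0, 1, 0, 0], [0, 0, 0, 1, 1, 1])
print(round(gap, 4))  # 0.3333
```

A perfectly fair classifier under this notion yields a gap of 0; FairSample's regularization objective pushes the trained GCN toward that value while preserving accuracy.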
Pages: 1537-1551
Page count: 15
Related Papers
50 records in total
  • [41] Training spiking neural networks efficiently and with ease in mlGeNN
    Knight, James
    Nowotny, Thomas
    JOURNAL OF COMPUTATIONAL NEUROSCIENCE, 2024, 52 : S95 - S96
  • [42] Generalization Guarantee of Training Graph Convolutional Networks with Graph Topology Sampling
    Li, Hongkang
    Wang, Meng
    Liu, Sijia
    Chen, Pin-Yu
    Xiong, Jinjun
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022
  • [43] A Hybrid Model for Soybean Yield Prediction Integrating Convolutional Neural Networks, Recurrent Neural Networks, and Graph Convolutional Networks
    Ingole, Vikram S.
    Kshirsagar, Ujwala A.
    Singh, Vikash
    Yadav, Manish Varun
    Krishna, Bipin
    Kumar, Roshan
    COMPUTATION, 2025, 13 (01)
  • [44] Stochastic Training of Graph Convolutional Networks with Variance Reduction
    Chen, Jianfei
    Zhu, Jun
    Song, Le
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 80, 2018, 80
  • [45] Certifiable Robustness and Robust Training for Graph Convolutional Networks
    Zuegner, Daniel
    Guennemann, Stephan
    KDD'19: PROCEEDINGS OF THE 25TH ACM SIGKDD INTERNATIONAL CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, 2019, : 246 - 256
  • [46] Batch virtual adversarial training for graph convolutional networks
    Deng, Zhijie
    Dong, Yinpeng
    Zhu, Jun
    AI OPEN, 2023, 4 : 73 - 79
  • [47] On Provable Benefits of Depth in Training Graph Convolutional Networks
    Cong, Weilin
    Ramezani, Morteza
    Mahdavi, Mehrdad
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [48] JOINT TRAINING OF CONVOLUTIONAL AND NON-CONVOLUTIONAL NEURAL NETWORKS
    Soltau, Hagen
    Saon, George
    Sainath, Tara N.
    2014 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2014
  • [49] Convolutional neural networks for accurate estimation of canopy cover
    Puig, F.
    Perea, R. Gonzalez
    Daccache, A.
    Soriano, M. A.
    Diaz, J. A. Rodriguez
    SMART AGRICULTURAL TECHNOLOGY, 2025, 10
  • [50] Efficient and accurate compound scaling for convolutional neural networks
    Lin, Chengmin
    Yang, Pengfei
    Wang, Quan
    Qiu, Zeyu
    Lv, Wenkai
    Wang, Zhenyi
    NEURAL NETWORKS, 2023, 167 : 787 - 797