A Biased Graph Neural Network Sampler with Near-Optimal Regret

Cited: 0
Authors:
Zhang, Qingru [1 ]
Wipf, David [2 ]
Gan, Quan [2 ]
Song, Le [1 ,3 ]
Institutions:
[1] Georgia Inst Technol, Atlanta, GA 30332 USA
[2] Amazon Shanghai AI Lab, Shanghai, Peoples R China
[3] Mohamed Bin Zayed Univ Artificial Intelligence, Abu Dhabi, U Arab Emirates
Keywords: (none listed)
DOI: not available
Chinese Library Classification (CLC): TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract:
Graph neural networks (GNNs) have recently emerged as a vehicle for applying deep network architectures to graph and relational data. However, given the increasing size of industrial datasets, in many practical situations the message passing computations required for sharing information across GNN layers are no longer scalable. Although various sampling methods have been introduced to approximate full-graph training within a tractable budget, there remain unresolved complications such as high variance and limited theoretical guarantees. To address these issues, we build upon existing work and treat GNN neighbor sampling as a multi-armed bandit problem but with a newly-designed reward function that introduces some degree of bias designed to reduce variance and avoid unstable, possibly-unbounded payouts. And unlike prior bandit-GNN use cases, the resulting policy leads to near-optimal regret while accounting for the GNN training dynamics introduced by SGD. From a practical standpoint, this translates into lower-variance estimates and competitive or superior test accuracy across several benchmarks.
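The abstract describes the sampler only at a high level, so the following is a minimal, hypothetical sketch of the general idea of bandit-based neighbor sampling rather than the paper's actual algorithm or reward function: each node's neighbors are treated as arms, an EXP3-style multiplicative-weights policy chooses which neighbors to aggregate, and rewards are clipped so payouts stay bounded. The class name BanditNeighborSampler, the reward_cap parameter, and the toy reward signal are all illustrative assumptions, not details taken from the paper.

import numpy as np

# Hypothetical sketch: EXP3-style bandit policy for picking which neighbors
# of a single node to aggregate in a sampled GNN layer. Rewards are clipped
# so payouts stay bounded; none of the names below come from the paper.
class BanditNeighborSampler:
    def __init__(self, neighbors, budget=5, eta=0.1, reward_cap=1.0, seed=0):
        self.neighbors = np.asarray(neighbors)      # neighbor ids of one node
        self.budget = min(budget, len(neighbors))   # how many neighbors to keep
        self.eta = eta                              # step size of the weight update
        self.reward_cap = reward_cap                # clip rewards to bound payouts
        self.weights = np.ones(len(neighbors))      # one weight per arm (neighbor)
        self.rng = np.random.default_rng(seed)

    def probabilities(self):
        # Current sampling distribution over neighbors.
        return self.weights / self.weights.sum()

    def sample(self):
        # Draw 'budget' distinct neighbors according to the current policy.
        p = self.probabilities()
        idx = self.rng.choice(len(self.neighbors), size=self.budget,
                              replace=False, p=p)
        return idx, self.neighbors[idx]

    def update(self, idx, rewards):
        # Importance-weight the clipped rewards of the sampled arms, then
        # apply a multiplicative-weights (EXP3-style) step.
        p = self.probabilities()
        rewards = np.clip(np.asarray(rewards, dtype=float), 0.0, self.reward_cap)
        est = np.zeros(len(self.neighbors))
        est[idx] = rewards / np.maximum(p[idx], 1e-12)
        self.weights *= np.exp(self.eta * est / len(self.neighbors))

# Toy usage with a fake per-neighbor "usefulness" signal standing in for
# whatever reward the training loop would actually observe.
sampler = BanditNeighborSampler(neighbors=list(range(20)), budget=4)
true_quality = np.linspace(0.1, 1.0, 20)
for step in range(200):
    idx, picked = sampler.sample()
    noisy_rewards = true_quality[idx] + 0.05 * np.random.randn(len(idx))
    sampler.update(idx, noisy_rewards)
print("Top-weighted neighbors:", np.argsort(-sampler.weights)[:4])

Clipping the reward, as in this sketch, trades a small amount of bias for bounded payouts, which is the rough intuition behind the biased reward design mentioned in the abstract.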
Pages: 12
Related papers (50 total):
  • [21] Log(Graph): A Near-Optimal High-Performance Graph Representation
    Besta, Maciej
    Stanojevic, Dimitri
    Zivic, Tijana
    Singh, Jagpreet
    Hoerold, Maurice
    Hoefler, Torsten
    27TH INTERNATIONAL CONFERENCE ON PARALLEL ARCHITECTURES AND COMPILATION TECHNIQUES (PACT 2018), 2018,
  • [22] Near-Optimal Glimpse Sequences for Improved Hard Attention Neural Network Training
    Harvey, William
    Teng, Michael
    Wood, Frank
    2022 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2022,
  • [23] Near-Optimal Graph Signal Sampling by Pareto Optimization
    Luo, Dongqi
    Si, Binqiang
    Zhang, Saite
    Yu, Fan
    Zhu, Jihong
    SENSORS, 2021, 21 (04) : 1 - 13
  • [24] Evolutionary programming of near-optimal neural networks
    Lock, D
    Giraud-Carrier, C
    ARTIFICIAL NEURAL NETS AND GENETIC ALGORITHMS, 1999, : 302 - 306
  • [25] Graph neural networks and implicit neural representation for near-optimal topology prediction over irregular design domains
    Seo, Minsik
    Min, Seungjae
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2023, 123
  • [26] Near-Optimal Network Design with Selfish Agents
    Computer Science Department, Rensselaer Polytechnic Institute, United States
    THEORY OF COMPUTING, 2008, 4 : 77 - 109
  • [27] Sincronia: Near-Optimal Network Design for Coflows
    Agarwal, Saksham
    Rajakrishnan, Shijin
    Narayan, Akshay
    Agarwal, Rachit
    Shmoys, David
    Vahdat, Amin
    PROCEEDINGS OF THE 2018 CONFERENCE OF THE ACM SPECIAL INTEREST GROUP ON DATA COMMUNICATION (SIGCOMM '18), 2018, : 16 - 29
  • [28] Self-accelerated Thompson sampling with near-optimal regret upper bound
    Zhu, Zhenyu
    Huang, Liusheng
    Xu, Hongli
    NEUROCOMPUTING, 2020, 399 : 37 - 47
  • [29] SAGA: Sparsity-Agnostic Graph Convolutional Network Acceleration with Near-optimal Workload Balance
    Gandham, Sanjay
    Yin, Lingxiang
    Zheng, Hao
    Lin, Mingjie
    2023 IEEE/ACM INTERNATIONAL CONFERENCE ON COMPUTER AIDED DESIGN, ICCAD, 2023,
  • [30] Behavior and neural basis of near-optimal visual search
    Ma, Wei Ji
    Navalpakkam, Vidhya
    Beck, Jeffrey M.
    van den Berg, Ronald
    Pouget, Alexandre
    NATURE NEUROSCIENCE, 2011, 14 (06) : 783 - U150