Decoupled differentiable graph neural architecture search

Citations: 0
Authors
Chen, Jiamin [1 ]
Gao, Jianliang [1 ]
Wu, Zhenpeng [1 ]
Al-Sabri, Raeed [1 ]
Oloulade, Babatounde Moctard [1 ]
Affiliations
[1] Cent South Univ, Sch Comp Sci & Engn, Changsha 410083, Hunan, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Graph neural architecture search; Decoupled differentiable optimization; Supernet pruning; Graph neural network;
DOI
10.1016/j.ins.2024.120700
Chinese Library Classification (CLC)
TP [Automation technology; computer technology]
Discipline code
0812
Abstract
Differentiable graph neural architecture search (GNAS) automatically and efficiently designs high-performing graph neural networks (GNNs) for different graph data distributions. Given a GNN search space containing multiple candidate operations for each GNN component, a differentiable GNAS method builds a mixed supernet in which learnable architecture parameters weight the candidate operations. Once the mixed supernet is optimized, it is pruned according to the best architecture parameters, efficiently identifying the optimal GNN architecture in the search space. However, the multiplicative relationship between the architecture parameters and the candidate operations introduces a coupled optimization bias into the weight optimization of the supernet's candidate operations, degrading the performance of differentiable GNAS. To address this coupled optimization bias, we propose Decoupled Differentiable Graph Neural Architecture Search (D2GNAS). D2GNAS uses the Gumbel distribution as a bridge to decouple the weight optimization of the supernet's candidate operations from the architecture parameters, yielding a decoupled differentiable GNN architecture sampler. The sampler selects promising GNN architectures using the architecture parameters as sampling probabilities, and it is further optimized through validation gradients derived from the sampled architectures. In addition, D2GNAS builds a single-path supernet with a pruning strategy that progressively compresses the supernet to further improve search efficiency. Extensive experiments on multiple benchmark graphs show that D2GNAS outperforms all established baselines, including both hand-designed GNNs and GNAS methods.
Moreover, D2GNAS has a lower time complexity than previous differentiable GNAS methods, achieving an average 5x efficiency improvement on the fair GNN search space. Code is available at https://github.com/AutoMachine0/D2GNAS.
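To make the contrast in the abstract concrete, the sketch below compares a coupled, DARTS-style mixed supernet (softmax-weighted sum over all candidates) with a decoupled sampler that uses the Gumbel-max trick to pick a single candidate operation per step. This is a minimal illustration, not the paper's implementation: the candidate names, logit values, and single-layer setting are hypothetical assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical candidate operations for one GNN component
# (illustrative names, not the paper's exact search space).
CANDIDATES = ["gcn", "gat", "sage", "gin"]

# Learnable architecture parameters (logits), one per candidate.
alpha = np.array([0.2, 1.5, -0.3, 0.6])

def mixed_supernet_weights(alpha):
    """Coupled (mixed-supernet) scheme: every candidate's output is
    scaled by softmax(alpha), so operation weights and architecture
    parameters are optimized jointly -- the source of the coupled
    optimization bias that D2GNAS aims to remove."""
    e = np.exp(alpha - alpha.max())  # stable softmax
    return e / e.sum()

def gumbel_max_sample(alpha, rng):
    """Decoupled scheme: draw exactly ONE candidate via the Gumbel-max
    trick; argmax(alpha + Gumbel noise) follows Categorical(softmax(alpha)),
    so only the sampled operation's weights are touched in this step."""
    g = rng.gumbel(size=alpha.shape)
    return int(np.argmax(alpha + g))

probs = mixed_supernet_weights(alpha)
counts = np.zeros(len(CANDIDATES))
for _ in range(20000):
    counts[gumbel_max_sample(alpha, rng)] += 1

# Empirical sampling frequencies track softmax(alpha), confirming that
# the sampler treats architecture parameters as sampling probabilities.
print(dict(zip(CANDIDATES, np.round(counts / counts.sum(), 2))))
```

In the coupled scheme, the gradient reaching each candidate's weights is scaled by its softmax coefficient; in the decoupled scheme, the sampled architecture is trained as an ordinary single-path GNN while the architecture parameters are updated separately from validation gradients.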
Pages: 18