Decoupled differentiable graph neural architecture search

Cited by: 0
Authors
Chen, Jiamin [1 ]
Gao, Jianliang [1 ]
Wu, Zhenpeng [1 ]
Al-Sabri, Raeed [1 ]
Oloulade, Babatounde Moctard [1 ]
Affiliations
[1] Cent South Univ, Sch Comp Sci & Engn, Changsha 410083, Hunan, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Graph neural architecture search; Decoupled differentiable optimization; Supernet pruning; Graph neural network;
DOI
10.1016/j.ins.2024.120700
Chinese Library Classification
TP [Automation technology; computer technology];
Subject Classification Code
0812 ;
Abstract
Differentiable graph neural architecture search (GNAS) automatically and efficiently designs graph neural networks (GNNs) that perform well across different graph data distributions. Given a GNN search space containing multiple candidate operations for each GNN component, a differentiable GNAS method builds a mixed supernet in which learnable architecture parameters weight the candidate operations. Once the mixed supernet has been optimized, it is pruned according to the best architecture parameters to efficiently identify the optimal GNN architecture in the search space. However, the multiplicative relationship between the architecture parameters and the candidate operations introduces a coupled optimization bias into the weight optimization of the supernet's candidate operations, and this bias degrades differentiable GNAS performance. To address this coupled optimization bias, we propose Decoupled Differentiable Graph Neural Architecture Search (D2GNAS). It uses the Gumbel distribution as a bridge to decouple the optimization of the supernet candidate-operation weights from that of the architecture parameters, yielding a decoupled differentiable GNN architecture sampler. The sampler selects promising GNN architectures by treating the architecture parameters as sampling probabilities, and it is further optimized through validation gradients derived from the sampled architectures. In addition, D2GNAS builds a single-path supernet with a pruning strategy that progressively compresses the supernet to further improve search efficiency. We conduct extensive experiments on multiple benchmark graphs. The experimental findings demonstrate that D2GNAS outperforms all established baseline methods, both manually designed GNNs and GNAS methods, in terms of performance.
Additionally, D2GNAS has lower time complexity than previous differentiable GNAS methods. On the fair GNN search space, it achieves an average 5x efficiency improvement. Code is available at https://github.com/AutoMachine0/D2GNAS.
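The decoupled sampling idea in the abstract — drawing discrete architectures from architecture parameters via Gumbel noise, so operation weights are trained only on sampled single-path architectures — can be sketched with the standard Gumbel-max trick. This is a minimal illustrative sketch, not the paper's implementation; the array shapes and names (`log_alpha`, `gumbel_max_sample`) are assumptions for exposition.

```python
import numpy as np

def gumbel_max_sample(log_alpha, rng):
    """Sample one candidate-operation index per layer via the Gumbel-max trick.

    log_alpha: (num_layers, num_ops) learnable architecture parameters (logits).
    Adding i.i.d. Gumbel(0, 1) noise to the logits and taking the argmax draws
    an exact sample from softmax(log_alpha), so the discrete architecture choice
    is decoupled from the supernet weight update.
    """
    # Gumbel(0, 1) noise: -log(-log(U)), U ~ Uniform(0, 1)
    gumbel = -np.log(-np.log(rng.uniform(size=log_alpha.shape)))
    return np.argmax(log_alpha + gumbel, axis=-1)

rng = np.random.default_rng(0)
log_alpha = np.zeros((3, 4))  # 3 layers, 4 candidate ops, uniform prior
samples = np.array([gumbel_max_sample(log_alpha, rng) for _ in range(2000)])
# With uniform logits, each op should be drawn roughly 25% of the time per layer.
freq = (samples == 0).mean(axis=0)
```

Each draw yields a single-path architecture (one operation per layer), which is what would be trained and evaluated; validation feedback on the sampled architectures can then update `log_alpha`.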
Pages: 18
Related Papers
50 items in total
  • [21] DDNAS: Discretized Differentiable Neural Architecture Search for Text Classification
    Chen, Kuan-Chun
    Li, Cheng-Te
    Lee, Kuo-Jung
    ACM TRANSACTIONS ON INTELLIGENT SYSTEMS AND TECHNOLOGY, 2023, 14 (05)
  • [22] Att-DARTS: Differentiable Neural Architecture Search for Attention
    Nakai, Kohei
    Matsubara, Takashi
    Uehara, Kuniaki
    2020 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2020,
  • [23] Efficient and Lightweight Visual Tracking with Differentiable Neural Architecture Search
    Gao, Peng
    Liu, Xiao
    Sang, Hong-Chuan
    Wang, Yu
    Wang, Fei
    ELECTRONICS, 2023, 12 (17)
  • [24] MDARTS: Multi-objective Differentiable Neural Architecture Search
    Kim, Sunghoon
    Kwon, Hyunjeong
    Kwon, Eunji
    Choi, Youngchang
    Oh, Tae-Hyun
    Kang, Seokhyeong
    PROCEEDINGS OF THE 2021 DESIGN, AUTOMATION & TEST IN EUROPE CONFERENCE & EXHIBITION (DATE 2021), 2021, : 1344 - 1349
  • [25] Efficient Automation of Neural Network Design: A Survey on Differentiable Neural Architecture Search
    Heuillet, Alexandre
    Nasser, Ahmad
    Arioui, Hichem
    Tabia, Hedi
    ACM COMPUTING SURVEYS, 2024, 56 (11)
  • [26] EFFICIENT DECOUPLED NEURAL ARCHITECTURE SEARCH BY STRUCTURE AND OPERATION SAMPLING
    Lee, Heung-Chang
    Kim, Do-Guk
    Han, Bohyung
    2020 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, 2020, : 4222 - 4226
  • [27] Universal Binary Neural Networks Design by Improved Differentiable Neural Architecture Search
    Tan, Menghao
    Gao, Weifeng
    Li, Hong
    Xie, Jin
    Gong, Maoguo
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2024, 34 (10) : 9153 - 9165
  • [28] Cyclic Differentiable Architecture Search
    Yu, Hongyuan
    Peng, Houwen
    Huang, Yan
    Fu, Jianlong
    Du, Hao
    Wang, Liang
    Ling, Haibin
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2023, 45 (01) : 211 - 228
  • [29] Adversarially Robust Neural Architecture Search for Graph Neural Networks
    Xie, Beini
    Chang, Heng
    Zhang, Ziwei
    Wang, Xin
    Wang, Daxin
    Zhang, Zhiqiang
    Ying, Rex
    Zhu, Wenwu
    2023 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2023, : 8143 - 8152
  • [30] Differentiable quantum architecture search
    Zhang, Shi-Xin
    Hsieh, Chang-Yu
    Zhang, Shengyu
    Yao, Hong
    QUANTUM SCIENCE AND TECHNOLOGY, 2022, 7 (04)