Evolutionary Neural Architecture Search for Transferable Networks

Cited by: 1
Authors
Zhou, Xun [1 ]
Liu, Songbai [2 ]
Qin, A. K. [3 ]
Tan, Kay Chen [1 ]
Affiliations
[1] Hong Kong Polytech Univ, Dept Comp, Hung Hom, Hong Kong 999077, Peoples R China
[2] Shenzhen Univ, Coll Comp Sci & Software Engn, Shenzhen 518060, Peoples R China
[3] Swinburne Univ Technol, Dept Comp Technol, Hawthorn, Vic 3122, Australia
Funding
National Natural Science Foundation of China; Australian Research Council;
Keywords
Evolutionary algorithm; neural architecture search; transferable architecture; TRANSFER OPTIMIZATION; GENETIC ALGORITHM;
DOI
10.1109/TETCI.2024.3427763
CLC number
TP18 [Theory of Artificial Intelligence];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The recent proliferation of edge computing has led to the deployment of deep neural networks (DNNs) on edge devices such as smartphones and IoT devices to serve end users. However, developing the most suitable DNN model for each on-device task is nontrivial, owing to data-governance constraints on these tasks and data heterogeneity across them. Existing approaches tackle this issue by learning task-specific models on the device, which demands substantial computation and exacerbates the computational and energy burden on edge devices. This research aims to improve the deployment efficiency of advanced models on edge devices, with a specific focus on reducing the on-device learning cost. To this end, we propose a category-specific but task-agnostic evolutionary neural architecture search (CSTA-ENAS) method. CSTA-ENAS uses the available datasets of multiple other tasks in the same category as the on-device tasks to design a transferable architecture on the server. This architecture then requires only light on-device fine-tuning to satisfactorily solve all the different on-device tasks, significantly reducing on-device learning time and the associated energy consumption. To improve search efficiency, a supernet-based partial-training strategy is proposed to reduce the evaluation cost of candidate architectures. To demonstrate the effectiveness of CSTA-ENAS, we build transferable DNN models and evaluate their accuracy on a set of new image classification tasks. Our models achieve competitive performance compared with most existing task-specific and transferable models while requiring fewer on-device computational resources.
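The search loop the abstract describes (evaluate each candidate architecture on several source datasets from the same category, keep the candidates that do well on all of them, then fine-tune the winner on each device) can be illustrated with a minimal, self-contained Python sketch. This is not the authors' implementation: the operation set, the source-task names, the proxy_score stand-in for supernet-based partial training, and every hyperparameter below are hypothetical placeholders chosen only to make the idea concrete.

import random

# Toy architecture encoding: one operation choice per block. All names and
# numbers here are illustrative placeholders, not values from the paper.
N_BLOCKS = 8
OPS = ["conv3x3", "conv5x5", "sep3x3", "skip"]
SOURCE_TASKS = ["task_a", "task_b", "task_c"]  # stand-ins for same-category datasets

def random_arch():
    return [random.choice(OPS) for _ in range(N_BLOCKS)]

def proxy_score(arch, task):
    # Placeholder for partially training a candidate (e.g. with weights inherited
    # from a supernet) on one source task and reading off a validation score.
    rng = random.Random(hash((tuple(arch), task)))
    depth_bonus = sum(op != "skip" for op in arch) / N_BLOCKS
    return 0.5 * depth_bonus + 0.5 * rng.random()

def fitness(arch):
    # Average over all source tasks: a candidate has to score well on every
    # dataset in the category, which is what makes the winner transferable.
    return sum(proxy_score(arch, t) for t in SOURCE_TASKS) / len(SOURCE_TASKS)

def mutate(arch, p=0.2):
    return [random.choice(OPS) if random.random() < p else op for op in arch]

def evolve(pop_size=20, generations=30, tournament=3):
    population = [random_arch() for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(population, key=fitness, reverse=True)
        elites = ranked[:2]  # keep the two best candidates
        children = []
        while len(children) < pop_size - len(elites):
            parent = max(random.sample(ranked, tournament), key=fitness)
            children.append(mutate(parent))
        population = elites + children
    return max(population, key=fitness)

if __name__ == "__main__":
    best = evolve()
    print("selected transferable architecture:", best)
    print("average proxy fitness:", round(fitness(best), 3))

Averaging fitness over the source tasks is the part that pushes the search toward a transferable architecture: a candidate that excels on one dataset but fails on the others is outranked by one that is solid everywhere. In the method itself, the per-candidate evaluation that proxy_score only gestures at is made affordable by the supernet-based partial-training strategy, and the selected architecture is then shipped to devices where only light fine-tuning is needed per task.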
Pages: 13
Related Papers
(50 records in total)
  • [1] A surrogate evolutionary neural architecture search algorithm for graph neural networks
    Liu, Yang
    Liu, Jing
    APPLIED SOFT COMPUTING, 2023, 144
  • [2] Multi-objective Evolutionary Neural Architecture Search for Recurrent Neural Networks
    Booysen, Reinhard
    Bosman, Anna Sergeevna
    NEURAL PROCESSING LETTERS, 2024, 56 (04)
  • [3] Genetic-GNN: Evolutionary architecture search for Graph Neural Networks
    Shi, Min
    Tang, Yufei
    Zhu, Xingquan
    Huang, Yu
    Wilson, David
    Zhuang, Yuan
    Liu, Jianxun
    KNOWLEDGE-BASED SYSTEMS, 2022, 247
  • [4] Evolutionary approximation and neural architecture search
    Pinos, Michal
    Mrazek, Vojtech
    Sekanina, Lukas
    GENETIC PROGRAMMING AND EVOLVABLE MACHINES, 2022, 23 (03) : 351 - 374
  • [5] A Survey on Evolutionary Neural Architecture Search
    Liu, Yuqiao
    Sun, Yanan
    Xue, Bing
    Zhang, Mengjie
    Yen, Gary G.
    Tan, Kay Chen
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2023, 34 (02) : 550 - 570
  • [6] Evolutionary Neural Architecture Search and Applications
    Sun, Yanan
    Zhang, Mengjie
    Yen, Gary G.
    IEEE COMPUTATIONAL INTELLIGENCE MAGAZINE, 2021, 16 (03) : 8 - 9
  • [7] Evolutionary Recurrent Neural Architecture Search
    Tian, Shuo
    Hu, Kai
    Guo, Shasha
    Li, Shiming
    Wang, Lei
    Xu, Weixia
    IEEE EMBEDDED SYSTEMS LETTERS, 2021, 13 (03) : 110 - 113
  • [9] A General-Purpose Transferable Predictor for Neural Architecture Search
    Han, Fred X.
    Mills, Keith G.
    Chudak, Fabian
    Riahi, Parsa
    Salameh, Mohammad
    Zhang, Jialin
    Lu, Wei
    Jui, Shangling
    Niu, Di
    PROCEEDINGS OF THE 2023 SIAM INTERNATIONAL CONFERENCE ON DATA MINING, SDM, 2023, : 721 - 729
  • [10] Quantum-Inspired Evolutionary Algorithm for Convolutional Neural Networks Architecture Search
    Ye, Weiliang
    Liu, Ruijiao
    Li, Yangyang
    Jiao, Licheng
    2020 IEEE CONGRESS ON EVOLUTIONARY COMPUTATION (CEC), 2020