EDANAS: Adaptive Neural Architecture Search for Early Exit Neural Networks

Cited by: 3
Authors
Gambella, Matteo [1 ]
Roveri, Manuel [1 ]
Affiliations
[1] Politecn Milan, Dipartimento Elettron Informaz & Bioingn, Milan, Italy
Source
2023 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN | 2023
Keywords
Neural Architecture Search (NAS); Once-For-All Network (OFA); Tiny Machine Learning; Early Exit Neural Networks; Adaptive Neural Networks;
DOI
10.1109/IJCNN54540.2023.10191876
CLC Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Early Exit Neural Networks (EENNs) endow neural network architectures with auxiliary classifiers that progressively process the input and make decisions at intermediate points of the network. This brings significant benefits in effectiveness and efficiency, such as a reduction in average inference time and the mitigation of overfitting and vanishing-gradient phenomena. Currently, the design of EENNs, a complex and time-consuming task, is carried out manually by experts. This is where Neural Architecture Search (NAS) comes into play: it automatically designs neural network architectures while also optimizing their computational demand. These requirements are crucial when designing machine and deep learning solutions meant to operate on devices constrained by the technology (computation, memory, and energy), such as Internet-of-Things and embedded systems. Interestingly, few NAS solutions have taken the design of early-exiting mechanisms into account. This work introduces, for the first time in the literature, a framework called Early exit aDAptive Neural Architecture Search (EDANAS) for the automatic design of both the EENN architecture and the parameters that manage its early exit mechanism, optimizing both classification accuracy and computational demand. EDANAS has proven to compete with expert-designed early exit solutions, paving the way for a new era in the prominent field of NAS.
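The early-exit mechanism the abstract describes can be sketched generically. This is not EDANAS itself (the paper searches over architectures and exit parameters automatically); the layer sizes, the confidence threshold, and the max-softmax exit rule below are illustrative assumptions only:

```python
import numpy as np


def softmax(z):
    """Numerically stable softmax over a 1-D logit vector."""
    e = np.exp(z - z.max())
    return e / e.sum()


class EarlyExitMLP:
    """Toy two-stage network with one auxiliary (early) classifier.

    If the early exit's softmax confidence exceeds `threshold`,
    inference stops there, saving the cost of the later layers.
    All weights are random: this only illustrates the control flow.
    """

    def __init__(self, threshold=0.9, seed=0):
        rng = np.random.default_rng(seed)
        self.threshold = threshold
        self.w1 = rng.normal(size=(4, 8))     # backbone, stage 1
        self.exit1 = rng.normal(size=(8, 3))  # auxiliary classifier
        self.w2 = rng.normal(size=(8, 8))     # backbone, stage 2
        self.exit2 = rng.normal(size=(8, 3))  # final classifier

    def predict(self, x):
        h = np.tanh(x @ self.w1)
        p = softmax(h @ self.exit1)
        if p.max() >= self.threshold:         # confident: exit early
            return int(p.argmax()), "early"
        h = np.tanh(h @ self.w2)              # otherwise run the rest
        p = softmax(h @ self.exit2)
        return int(p.argmax()), "final"
```

In an EENN, easy inputs leave at the auxiliary classifier and only hard inputs pay for the full depth; what EDANAS contributes, per the abstract, is jointly searching the architecture and the exit-management parameters (here hard-coded as `threshold`) rather than tuning them by hand.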
Pages: 8
Related Papers
50 records
  • [41] Fully Pipelined FPGA Acceleration of Binary Convolutional Neural Networks with Neural Architecture Search
    Ji, Mengfei
    Al-Ars, Zaid
    Chang, Yuchun
    Zhang, Baolin
    JOURNAL OF CIRCUITS SYSTEMS AND COMPUTERS, 2024, 33 (10)
  • [42] Neural Architecture Search for LF-MMI Trained Time Delay Neural Networks
    Hu, Shoukang
    Xie, Xurong
    Cui, Mingyu
    Deng, Jiajun
    Liu, Shansong
    Yu, Jianwei
    Geng, Mengzhe
    Liu, Xunying
    Meng, Helen
    IEEE-ACM TRANSACTIONS ON AUDIO SPEECH AND LANGUAGE PROCESSING, 2022, 30 : 1093 - 1107
  • [43] Neural Graph Embedding for Neural Architecture Search
    Li, Wei
    Gong, Shaogang
    Zhu, Xiatian
    THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34 : 4707 - 4714
  • [44] Lightweight Neural Architecture Search for Temporal Convolutional Networks at the Edge
    Risso, Matteo
    Burrello, Alessio
    Conti, Francesco
    Lamberti, Lorenzo
    Chen, Yukai
    Benini, Luca
    Macii, Enrico
    Poncino, Massimo
    Pagliari, Daniele Jahier
    IEEE TRANSACTIONS ON COMPUTERS, 2023, 72 (03) : 744 - 758
  • [45] NASGuard: A Novel Accelerator Architecture for Robust Neural Architecture Search (NAS) Networks
    Wang, Xingbin
    Zhao, Boyan
    Hou, Rui
    Awad, Amro
    Tian, Zhihong
    Meng, Dan
    2021 ACM/IEEE 48TH ANNUAL INTERNATIONAL SYMPOSIUM ON COMPUTER ARCHITECTURE (ISCA 2021), 2021, : 776 - 789
  • [46] Zero Time Waste: Recycling Predictions in Early Exit Neural Networks
    Wolczyk, Maciej
    Wojcik, Bartosz
    Balazy, Klaudia
    Podolak, Igor
    Tabor, Jacek
    Smieja, Marek
    Trzcinski, Tomasz
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021,
  • [47] SEENN: Towards Temporal Spiking Early-Exit Neural Networks
    Li, Yuhang
    Geller, Tamar
    Kim, Youngeun
    Panda, Priyadarshini
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [48] Early-Exit with Class Exclusion for Efficient Inference of Neural Networks
    Wang, Jingcun
    Li, Bing
    Zhang, Grace Li
    2024 IEEE 6TH INTERNATIONAL CONFERENCE ON AI CIRCUITS AND SYSTEMS, AICAS 2024, 2024, : 263 - 267
  • [49] Selective Fine-Tuning on a Classifier Ensemble: Realizing Adaptive Neural Networks With a Diversified Multi-Exit Architecture
    Hirose, Kazutoshi
    Takamaeda-Yamazaki, Shinya
    Yu, Jaehoon
    Motomura, Masato
    IEEE ACCESS, 2021, 9 : 6179 - 6187
  • [50] MNGNAS: Distilling Adaptive Combination of Multiple Searched Networks for One-Shot Neural Architecture Search
    Chen, Zhihua
    Qiu, Guhao
    Li, Ping
    Zhu, Lei
    Yang, Xiaokang
    Sheng, Bin
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2023, 45 (11) : 13489 - 13508