Evolving Energy Efficient Convolutional Neural Networks

Cited: 0
Authors
Young, Steven R. [1 ]
Johnston, J. Travis [1 ]
Schuman, Catherine D. [1 ]
Devineni, Pravallika [1 ]
Kay, Bill [1 ]
Rose, Derek C. [1 ]
Parsa, Maryam [2 ]
Patton, Robert M. [1 ]
Potok, Thomas E. [1 ]
Affiliations
[1] Oak Ridge Natl Lab, Oak Ridge, TN 37830 USA
[2] Purdue Univ, Dept ECE, W Lafayette, IN 47907 USA
Keywords
neural networks; genetic algorithms; high-performance computing; energy efficiency
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
As deep neural networks have been deployed in more and more applications over the past half decade and are finding their way into an ever increasing number of operational systems, their energy consumption becomes a concern whether running in the datacenter or on edge devices. Hyperparameter optimization and automated network design for deep learning is a quickly growing field, but much of the focus has remained on optimizing only for the performance of the machine learning task. In this work, we demonstrate that the best performing networks created through this automated network design process have radically different computational characteristics (e.g., energy usage, model size, inference time), presenting the opportunity to utilize this optimization process to make deep learning networks more energy efficient and deployable to smaller devices. Optimizing for these computational characteristics is critical as the number of applications of deep learning continues to expand.
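The abstract describes automated network design via evolutionary search, selecting networks by computational characteristics (energy usage, model size, inference time) as well as task performance. The sketch below is a minimal illustration of that idea under stated assumptions, not the authors' implementation: a toy genetic algorithm over CNN hyperparameters whose fitness trades an accuracy term against a parameter-count proxy for energy and model size. The search space, the accuracy proxy, and all function names are hypothetical stand-ins; a real system would train and measure each candidate network.

```python
# Minimal sketch (not the authors' system): genetic search over CNN hyperparameters
# with a fitness that trades task performance against a computational-cost proxy.
import random

# Hypothetical hyperparameter search space.
SEARCH_SPACE = {
    "num_conv_layers": [2, 3, 4, 5],
    "filters":         [16, 32, 64, 128],
    "kernel_size":     [3, 5, 7],
    "dense_units":     [64, 128, 256],
}

def random_genome():
    return {k: random.choice(v) for k, v in SEARCH_SPACE.items()}

def param_count(g):
    # Rough parameter estimate for a stack of conv layers plus one dense layer,
    # standing in for energy usage / model size.
    params, channels = 0, 3
    for _ in range(g["num_conv_layers"]):
        params += channels * g["filters"] * g["kernel_size"] ** 2
        channels = g["filters"]
    return params + channels * g["dense_units"]

def evaluate(g):
    # Hypothetical accuracy proxy; a real search would train the candidate
    # network and measure validation accuracy, energy, and inference time.
    acc = 0.5 + 0.05 * g["num_conv_layers"] + 0.0005 * g["filters"] + random.gauss(0, 0.02)
    cost = param_count(g) / 1e5          # normalized computational-cost proxy
    return acc - 0.1 * cost              # scalar fitness: accuracy minus cost penalty

def mutate(g, rate=0.2):
    return {k: (random.choice(SEARCH_SPACE[k]) if random.random() < rate else v)
            for k, v in g.items()}

def crossover(a, b):
    return {k: random.choice([a[k], b[k]]) for k in a}

def evolve(pop_size=20, generations=10):
    pop = [random_genome() for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=evaluate, reverse=True)
        parents = scored[: pop_size // 2]                  # truncation selection
        children = [mutate(crossover(*random.sample(parents, 2)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=evaluate)

if __name__ == "__main__":
    random.seed(0)
    best = evolve()
    print("best genome:", best, "estimated params:", param_count(best))
```

In a multi-objective variant, the cost penalty could be replaced by Pareto ranking over accuracy, energy, and inference time rather than a single weighted scalar.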
Pages: 4479-4485
Page count: 7
Related Papers (50 total)
  • [41] CiMComp: An Energy Efficient Compute-in-Memory based Comparator for Convolutional Neural Networks
    Kavitha, S.
    Kailath, Binsu J.
    Reniwal, B. S.
    2024 DESIGN, AUTOMATION & TEST IN EUROPE CONFERENCE & EXHIBITION, DATE, 2024,
  • [42] SLIT: An Energy-Efficient Reconfigurable Hardware Architecture for Deep Convolutional Neural Networks
    Tran, Thi Diem
    Nakashima, Yasuhiko
    IEICE TRANSACTIONS ON ELECTRONICS, 2021, E104C (07) : 319 - 329
  • [43] Energy-Efficient Convolutional Neural Networks with Deterministic Bit-Stream Processing
    Faraji, S. Rasoul
    Najafi, M. Hassan
    Li, Bingzhe
    Lilja, David J.
    Bazargan, Kia
    2019 DESIGN, AUTOMATION & TEST IN EUROPE CONFERENCE & EXHIBITION (DATE), 2019, : 1757 - 1762
  • [44] A Heterogeneous and Reconfigurable Embedded Architecture for Energy-Efficient Execution of Convolutional Neural Networks
    Luebeck, Konstantin
    Bringmann, Oliver
    ARCHITECTURE OF COMPUTING SYSTEMS - ARCS 2019, 2019, 11479 : 267 - 280
  • [45] Evolving efficient connection for the design of artificial neural networks
    Shi, Min
    Wu, Haifeng
    ARTIFICIAL NEURAL NETWORKS - ICANN 2008, PT II, 2008, 5164 : 909 - +
  • [46] Energy Complexity Model for Convolutional Neural Networks
    Sima, Jiri
    Vidnerova, Petra
    Mrazek, Vojtech
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING, ICANN 2023, PART X, 2023, 14263 : 186 - 198
  • [47] Energy Propagation in Deep Convolutional Neural Networks
    Wiatowski, Thomas
    Grohs, Philipp
    Boelcskei, Helmut
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2018, 64 (07) : 4819 - 4842
  • [48] Efficient quantum state tomography with convolutional neural networks
    Schmale, Tobias
    Reh, Moritz
    Gaerttner, Martin
    NPJ QUANTUM INFORMATION, 2022, 8 (01)
  • [49] An Efficient Accelerator with Winograd for Novel Convolutional Neural Networks
    Lin, Zhijian
    Zhang, Meng
    Weng, Dongpeng
    Liu, Fei
    2022 5TH INTERNATIONAL CONFERENCE ON CIRCUITS, SYSTEMS AND SIMULATION (ICCSS 2022), 2022, : 126 - 130
  • [50] Memory Efficient Binary Convolutional Neural Networks on Microcontrollers
    Sakr, Fouad
    Berta, Riccardo
    Doyle, Joseph
    Younes, Hamoud
    De Gloria, Alessandro
    Bellotti, Francesco
    2022 IEEE INTERNATIONAL CONFERENCE ON EDGE COMPUTING & COMMUNICATIONS (IEEE EDGE 2022), 2022, : 169 - 177