aCortex: An Energy-Efficient Multipurpose Mixed-Signal Inference Accelerator

Cited by: 10
Authors
Bavandpour, Mohammad [1 ]
Mahmoodi, Mohammad R. [1 ]
Strukov, Dmitri B. [1 ]
Affiliations
[1] Univ Calif Santa Barbara, Dept Elect & Comp Engn, Santa Barbara, CA 93117 USA
Funding
U.S. National Science Foundation
Keywords
Artificial neural networks; floating-gate memory; machine learning; mixed-signal circuits; neuromorphic inference accelerator; nonvolatile memory (NVM); ANALOG;
DOI
10.1109/JXCDC.2020.2999581
CLC Number
TP3 [Computing technology; computer technology]
Discipline Code
0812
Abstract
We introduce "aCortex," an extremely energy-efficient, fast, compact, and versatile neuromorphic processor architecture suitable for accelerating a wide range of neural network inference models. The most important feature of our processor is a configurable mixed-signal computing array of vector-by-matrix multiplier (VMM) blocks that utilize embedded nonvolatile memory arrays for storing weight matrices. Analog peripheral circuitry for data conversion and high-voltage programming is shared among a large array of VMM blocks to enable compact and energy-efficient analog-domain VMM operation across different types of neural network layers. Other unique features of aCortex include a configurable chain of buffers and data buses, a simple and efficient instruction set architecture with its corresponding multiagent controller, a programmable quantization range, and a customized refresh-free embedded dynamic random access memory. The energy-optimal aCortex with 4-bit analog computing precision was designed in a 55-nm process with embedded NOR flash memory. Its physical performance was evaluated using experimental data from testing individual circuit elements and the physical layout of key components on several common benchmarks, namely, Inception-v1 and ResNet-152, two state-of-the-art deep feedforward networks for image classification, and GNMT, Google's deep recurrent network for language translation. The system-level simulation results for these benchmarks show energy efficiencies of 97, 106, and 336 TOp/J, respectively, combined with up to 15 TOp/s computing throughput and 0.27-MB/mm² storage efficiency. These estimated performance results compare favorably with those of previously reported mixed-signal accelerators based on much less mature, aggressively scaled resistive switching memories.
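The core operation described in the abstract is an analog-domain vector-by-matrix multiply (VMM) with weights stored as nonvolatile-memory states and a limited (4-bit) computing precision. The following is a minimal numerical sketch of that idea only; the function names (`quantize`, `analog_vmm`), the uniform quantization scheme, and the idealized noise-free model are illustrative assumptions, not the paper's actual circuit behavior.

```python
import numpy as np

def quantize(x, bits=4, x_max=1.0):
    # Uniform quantization of values in [0, x_max] to 2**bits - 1 levels,
    # loosely modeling the 4-bit analog computing precision (assumption:
    # idealized, noise-free uniform quantizer).
    levels = 2 ** bits - 1
    q = np.round(np.clip(x, 0.0, x_max) / x_max * levels)
    return q / levels * x_max

def analog_vmm(v, W, bits=4):
    # Idealized VMM: the input vector and the weight matrix are quantized
    # (as if weights were stored as discrete memory conductances), the
    # multiply-accumulate happens "in the analog domain," and the result
    # is re-quantized, standing in for a shared output ADC.
    vq = quantize(v, bits)
    Wq = quantize(W, bits)
    y = vq @ Wq
    scale = max(float(y.max()), 1e-9)   # normalize before output quantization
    return quantize(y / scale, bits) * scale

# Example: a 3-input, 2-output layer slice.
v = np.array([0.2, 0.5, 0.9])
W = np.array([[0.1, 0.8],
              [0.4, 0.3],
              [0.7, 0.6]])
print(analog_vmm(v, W))
```

Chaining such quantized VMM stages, with buffers between them, is roughly how a multilayer network would map onto an array of VMM blocks in this style of architecture.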
Pages
98-106 (9 pages)
Related Papers
(50 total; first 10 shown)
  • [1] An Energy-Efficient Programmable Mixed Signal Accelerator for Machine Learning Algorithms
    Kang, Mingu
    Srivastava, Prakalp
    Adve, Vikram
    Kim, Nam Sung
    Shanbhag, Naresh R.
    IEEE MICRO, 2019, 39 (05) : 64 - 72
  • [2] A 17.5 fJ/bit Energy-efficient Analog SRAM for Mixed-signal Processing
    Lee, Jinsu
    Shin, Dongjoo
    Kim, Youchang
    Yoo, Hoi-Jun
    2016 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS (ISCAS), 2016, : 1010 - 1013
  • [3] Optimization and evaluation of energy-efficient mixed-signal MFCC feature extraction architecture
    Zhang, Yanming
    Qiu, Xu
    Li, Qin
    Qiao, Fei
    Wei, Qi
    Luo, Li
    Yang, Huazhong
    2020 IEEE COMPUTER SOCIETY ANNUAL SYMPOSIUM ON VLSI (ISVLSI 2020), 2020, : 506 - 511
  • [4] An Energy-Efficient Mixed-Signal Neuron for Inherently Error-Resilient Neuromorphic Systems
    Chatterjee, Baibhab
    Panda, Priyadarshini
    Maity, Shovan
    Roy, Kaushik
    Sen, Shreyas
    2017 IEEE INTERNATIONAL CONFERENCE ON REBOOTING COMPUTING (ICRC), 2017, : 170 - 171
  • [5] Energy-Efficient CMOS Memristive Synapses for Mixed-Signal Neuromorphic System-on-a-Chip
    Saxena, Vishal
    Wu, Xinyu
    Zhu, Kehan
    2018 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS (ISCAS), 2018,
  • [6] Energy-efficient MFCC extraction architecture in mixed-signal domain for automatic speech recognition
    Li, Qin
    Zhu, Huifeng
    Qiao, Fei
    Wei, Qi
    Liu, Xinjun
    Yang, Huazhong
    NANOARCH'18: PROCEEDINGS OF THE 14TH IEEE/ACM INTERNATIONAL SYMPOSIUM ON NANOSCALE ARCHITECTURES, 2018, : 138 - 140
  • [7] A 17.5-fJ/bit Energy-Efficient Analog SRAM for Mixed-Signal Processing
    Lee, Jinsu
    Shin, Dongjoo
    Kim, Youchang
    Yoo, Hoi-Jun
    IEEE TRANSACTIONS ON VERY LARGE SCALE INTEGRATION (VLSI) SYSTEMS, 2017, 25 (10) : 2714 - 2723
  • [8] Memristive Mixed-Signal Neuromorphic Systems: Energy-Efficient Learning at the Circuit-Level
    Chakma, Gangotree
    Adnan, Md Musabbir
    Wyer, Austin R.
    Weiss, Ryan
    Schuman, Catherine D.
    Rose, Garrett S.
    IEEE JOURNAL ON EMERGING AND SELECTED TOPICS IN CIRCUITS AND SYSTEMS, 2018, 8 (01) : 125 - 136
  • [9] 55 nm CMOS Mixed-Signal Neuromorphic Circuits for Constructing Energy-Efficient Reconfigurable SNNs
    Quan, Jiale
    Liu, Zhen
    Li, Bo
    Zeng, Chuanbin
    Luo, Jiajun
    ELECTRONICS, 2023, 12 (19)
  • [10] An Energy-Efficient Mixed-Signal Parallel Multiply-Accumulate (MAC) Engine Based on Stochastic Computing
    Zhang, Xinyue
    Song, Jiahao
    Wang, Yuan
    Zhang, Yawen
    Zhang, Zuodong
    Wang, Runsheng
    Huang, Ru
    2019 IEEE 13TH INTERNATIONAL CONFERENCE ON ASIC (ASICON), 2019,