Recent progress in analog memory-based accelerators for deep learning

Cited by: 152
Authors
Tsai, Hsinyu [1]
Ambrogio, Stefano [1]
Narayanan, Pritish [1]
Shelby, Robert M. [1]
Burr, Geoffrey W. [1]
Affiliations
[1] IBM Research - Almaden, 650 Harry Rd, San Jose, CA 95120, USA
Keywords
analog memory; non-volatile memory; hardware accelerators; deep learning; neural networks; optical implementation; phase change; Hopfield model; devices; synapse; design; strategies; system; array
DOI
10.1088/1361-6463/aac8a5
Chinese Library Classification
O59 [Applied physics]
Abstract
We survey recent progress in the use of analog memory devices to build neuromorphic hardware accelerators for deep learning applications. After an overview of deep learning and the application opportunities for deep neural network (DNN) hardware accelerators, we briefly discuss the research area of customized digital accelerators for deep learning. We discuss how the strengths and weaknesses of analog memory-based accelerators match well to the weaknesses and strengths of digital accelerators, and attempt to identify where the future hardware opportunities might be found. We survey the extensive but rapidly developing literature on what would be needed from an analog memory device to enable such a DNN accelerator, and summarize progress with various analog memory candidates, including non-volatile memories such as resistive RAM, phase change memory, and Li-ion-based devices; capacitor-based and other CMOS devices; and photonics-based devices and systems. After surveying recent circuit- and system-level work, we conclude with a description of the next research steps needed to move closer to the commercialization of viable analog-memory-based DNN hardware accelerators.
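To make the core concept behind these accelerators concrete, the sketch below illustrates the analog in-memory multiply-accumulate idea the abstract refers to: DNN weights are stored as conductances in a crossbar of analog memory cells, input activations are applied as voltages, and Ohm's and Kirchhoff's laws deliver the layer's weighted sums as column currents in a single analog step. This is a minimal illustration, not code from the paper; the function names, the differential conductance-pair encoding, and the conductance ranges and noise levels are illustrative assumptions.

```python
# Minimal sketch (assumed parameters) of an analog crossbar computing
# y = W^T x: weights become conductances, inputs become voltages,
# and column currents sum the products in one step.
import numpy as np

rng = np.random.default_rng(0)

def program_conductances(weights, g_min=1e-6, g_max=1e-4, write_noise=0.05):
    """Map a signed weight matrix onto differential conductance pairs.

    Each weight w is encoded as G_plus - G_minus so that negative weights
    are representable with strictly positive device conductances.
    Programming error is modeled as multiplicative Gaussian noise.
    (Hypothetical helper; ranges and noise level are assumptions.)
    """
    w_scaled = weights / np.max(np.abs(weights))            # normalize to [-1, 1]
    g_span = g_max - g_min
    g_plus = g_min + g_span * np.clip(w_scaled, 0, None)    # positive part
    g_minus = g_min + g_span * np.clip(-w_scaled, 0, None)  # negative part
    noisy = lambda g: g * (1 + write_noise * rng.standard_normal(g.shape))
    return noisy(g_plus), noisy(g_minus)

def analog_mac(g_plus, g_minus, v_in, read_noise=0.01):
    """One crossbar read: column currents realize y = W^T x in the analog domain."""
    i_out = (g_plus - g_minus).T @ v_in                     # Kirchhoff current summation
    return i_out * (1 + read_noise * rng.standard_normal(i_out.shape))

# Compare the noisy analog result against the ideal digital product.
W = rng.standard_normal((128, 64))
x = rng.standard_normal(128)
gp, gm = program_conductances(W)
y_analog = analog_mac(gp, gm, x)
y_ideal = (W / np.max(np.abs(W)) * (1e-4 - 1e-6)).T @ x
print("relative error:", np.linalg.norm(y_analog - y_ideal) / np.linalg.norm(y_ideal))
```

The differential pair is one common way to encode signed weights with strictly positive conductances; the survey's device candidates (resistive RAM, phase change memory, Li-ion-based, capacitor-based, and photonic devices) are different physical ways of realizing and updating those conductances.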
Pages: 27
Related papers
50 records
  • [1] Memory-based Deep Reinforcement Learning for POMDPs
    Meng, Lingheng
    Gorbet, Rob
    Kulic, Dana
    2021 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2021, : 5619 - 5626
  • [2] Learning with Memory-based Virtual Classes for Deep Metric Learning
    Ko, Byungsoo
    Gu, Geonmo
    Kim, Han-Gyu
    2021 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2021), 2021, : 11772 - 11781
  • [3] Analog-memory-based In-Memory-Computing Accelerators for Deep Neural Networks
    Tsai, Hsinyu
    2024 IEEE WORKSHOP ON MICROELECTRONICS AND ELECTRON DEVICES, WMED, 2024, : XIII - XIII
  • [4] The Progress and Trends of FPGA-Based Accelerators in Deep Learning
Wu, Y.-X.
Liang, K.
Liu, Y.
Cui, H.-M.
Jisuanji Xuebao/Chinese Journal of Computers, 2019, 42 (11): 2461 - 2480
  • [5] Study on LSTM and ConvLSTM Memory-Based Deep Reinforcement Learning
    Duarte, Fernando Fradique
    Lau, Nuno
    Pereira, Artur
    Reis, Luis Paulo
    AGENTS AND ARTIFICIAL INTELLIGENCE, ICAART 2023, 2024, 14546 : 223 - 243
  • [6] A THEORY FOR MEMORY-BASED LEARNING
Lin, J. H.
Vitter, J. S.
    MACHINE LEARNING, 1994, 17 (2-3) : 143 - 167
  • [7] Memory-based gaze prediction in deep imitation learning for robot manipulation
    Kim, Heecheol
    Ohmura, Yoshiyuki
    Kuniyoshi, Yasuo
    2022 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA 2022), 2022, : 2427 - 2433
  • [8] On Designing Efficient and Reliable Nonvolatile Memory-Based Computing-In-Memory Accelerators
    Yan, Bonan
    Liu, Mengyun
    Chen, Yiran
    Chakrabarty, Krishnendu
    Li, Hai
2019 IEEE INTERNATIONAL ELECTRON DEVICES MEETING (IEDM), 2019
  • [9] Scratchpad Memory Management for Deep Learning Accelerators
    Zouzoula, Stavroula
    Maleki, Mohammad Ali
    Azhar, Muhammad Waqar
    Trancoso, Pedro
    53RD INTERNATIONAL CONFERENCE ON PARALLEL PROCESSING, ICPP 2024, 2024, : 629 - 639
  • [10] RECENT PROGRESS IN ACCELERATORS
    CHU, EL
    SCHIFF, LI
    ANNUAL REVIEW OF NUCLEAR SCIENCE, 1953, 2 : 79 - 92