Spatial Data Dependence Graph Based Pre-RTL Simulator for Convolutional Neural Network Dataflows

Cited by: 4
Authors
Wang, Jooho [1 ]
Park, Sungkyung [2 ]
Park, Chester Sungchung [1 ]
Affiliations
[1] Konkuk Univ, Dept Elect & Elect Engn, Seoul 05029, South Korea
[2] Pusan Natl Univ, Dept Elect Engn, Pusan 46241, South Korea
Keywords
Hardware acceleration; Memory management; Convolutional neural networks; Bandwidth; Spatial databases; Registers; Power demand; Convolutional neural networks (CNNs); data dependence graph; design space exploration (DSE); hardware accelerators; latency-insensitive controller; pre-RTL simulator; spatial data dependence graph (SDDG); ARCHITECTURE; PERFORMANCE; INFERENCE; COST; DRAM;
DOI
10.1109/ACCESS.2022.3146413
Chinese Library Classification: TP [Automation Technology, Computer Technology]
Discipline Code: 0812
Abstract
In this paper, a new pre-RTL simulator is proposed to predict the power, performance, and area of convolutional neural network (CNN) dataflows prior to register-transfer-level (RTL) design. In the simulator, a novel approach is adopted to implement a spatial data dependence graph (SDDG), which enables us to model a specific dataflow alongside inter-instruction dependencies by tracking the status of each processing element (PE). In addition, the proposed pre-RTL simulator makes it possible to evaluate the impact of memory constraints such as latency and bandwidth. The latency-insensitive and bandwidth-insensitive PE controllers assumed in the proposed pre-RTL simulator guarantee both functional correctness and maximum performance, regardless of memory constraints. In particular, it is shown that the optimal distribution method of local memory bandwidth can reduce the accelerator execution time by up to 37.6% compared with the equal distribution method. For weight stationary (WS) and row stationary (RS) dataflows, the accelerator performance closely depends on memory constraints. The simulation results also show that the relative performances of dataflows depend on the layer shape of the convolutional layer. For example, for an identical hardware area in a standard convolutional layer of AlexNet, WS dataflows do not provide any performance gain over RS dataflows when the memory latency is sufficiently high. In addition, WS dataflows cannot fully reuse the input activation, thereby increasing local memory accesses, since the number of weights loaded at a specific time is limited. Moreover, in a depth-wise convolutional layer of MobileNet, WS dataflows tend to outperform RS dataflows even in the presence of large memory latency. The source code is available on the GitHub repository: https://github.com/SDL-KU/SDDGSim.
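The abstract's central idea — modeling a dataflow by tracking per-PE status through a dependence graph, under memory latency and bandwidth constraints — can be caricatured in a few lines. The sketch below is an assumption-laden toy, not the authors' SDDGSim code: every class, name, and the one-op-per-resource scheduling model are inventions of this illustration.

```python
# Toy sketch of the spatial-data-dependence-graph (SDDG) idea:
# nodes are operations mapped to a resource (a PE or a memory port),
# edges are data dependencies; resources execute one op at a time.
class Node:
    def __init__(self, name, pe, latency=1):
        self.name = name
        self.pe = pe              # resource the op is mapped to
        self.latency = latency    # cycles to complete (e.g., load vs. MAC)
        self.deps = []            # upstream nodes whose results this op consumes

def simulate(nodes):
    """Total cycles, assuming an op starts only once all its dependencies
    have finished. `nodes` must be in topological (dependency) order."""
    finish = {}      # node name -> cycle at which it completes
    busy_until = {}  # resource  -> cycle at which it becomes free
    for n in nodes:
        ready = max((finish[d.name] for d in n.deps), default=0)
        start = max(ready, busy_until.get(n.pe, 0))
        finish[n.name] = start + n.latency
        busy_until[n.pe] = finish[n.name]
    return max(finish.values())

# Example: two 4-cycle loads share one memory port, then a 1-cycle MAC on PE0.
w = Node("w_load", pe="mem", latency=4)
x = Node("x_load", pe="mem", latency=4)
mac = Node("mac", pe="PE0", latency=1)
mac.deps = [w, x]
print(simulate([w, x, mac]))  # -> 9: the shared port serializes the two loads
```

Giving the second load its own port drops the total to 5 cycles, which mirrors the abstract's point that how local memory bandwidth is distributed among PEs can substantially change execution time.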
Pages: 11382-11403
Page count: 22
Related Papers (50 total)
  • [1] A Pre-RTL Simulator for Neural Networks
    Cao, Shan
    Bao, Zhenyi
    Xue, Chengbo
    Deng, Wei
    Xu, Shugong
    Zhang, Shunqing
    2019 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS (ISCAS), 2019,
  • [2] Spatial Data Dependence Graph Simulator for Convolutional Neural Network Accelerators
    Wang, Jooho
    Kim, Jiwon
    Moon, Sungmin
    Kim, Sunwoo
    Park, Sungkyung
    Park, Chester Sungchung
    2019 IEEE INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE CIRCUITS AND SYSTEMS (AICAS 2019), 2019: 309-310
  • [3] Gesture recognition of graph convolutional neural network based on spatial domain
    Chen, Hong
    Zhao, Hongdong
    Qi, Baoqiang
    Zhang, Shuai
    Yu, Zhanghong
    NEURAL COMPUTING & APPLICATIONS, 2023, 35 (03): 2157-2167
  • [4] Convolutional Neural Network Outperforms Graph Neural Network on the Spatially Variant Graph Data
    Boronina, Anna
    Maksimenko, Vladimir
    Hramov, Alexander E. E.
    MATHEMATICS, 2023, 11 (11)
  • [5] Data Privacy Protection Model Based on Graph Convolutional Neural Network
    Gu, Tao
    Yang, Lin
    Wang, Hua
    MOBILE NETWORKS & APPLICATIONS, 2023, 29 (5): 1433-1440
  • [6] A graph convolutional neural network for classification of building patterns using spatial vector data
    Yan, Xiongfeng
    Ai, Tinghua
    Yang, Min
    Yin, Hongmei
    ISPRS JOURNAL OF PHOTOGRAMMETRY AND REMOTE SENSING, 2019, 150: 259-273
  • [7] A Convolutional Neural Network and Graph Convolutional Network Based Framework for AD Classification
    Lin, Lan
    Xiong, Min
    Zhang, Ge
    Kang, Wenjie
    Sun, Shen
    Wu, Shuicai
    SENSORS, 2023, 23 (04)
  • [8] Course Recommendation Based on Graph Convolutional Neural Network
    An Cong Tran
    Duc-Thien Tran
    Nguyen Thai-Nghe
    Tran Thanh Dien
    Hai Thanh Nguyen
    ADVANCES AND TRENDS IN ARTIFICIAL INTELLIGENCE. THEORY AND APPLICATIONS, IEA/AIE 2023, PT I, 2023, 13925: 235-240
  • [9] Mining the Graph Representation of Traffic Speed Data for Graph Convolutional Neural Network
    Mao, Jiannan
    Huang, Hao
    Chen, Yuting
    Lu, Weike
    Chen, Guoqiang
    Liu, Lan
    2021 IEEE INTELLIGENT TRANSPORTATION SYSTEMS CONFERENCE (ITSC), 2021: 1205-1210