High efficient training method of MiniGo on large-scale heterogeneous computing platform

Cited: 0
Authors
Li, Rongchun [1 ]
He, Zhouyu [1 ]
Qiao, Peng [1 ]
Jiang, Jingfei [1 ]
Dou, Yong [1 ]
Li, Dongsheng [1 ]
Affiliations
[1] National Key Laboratory of Parallel and Distributed Computing, National University of Defense Technology, Changsha 410073, China
Abstract
An efficient multi-level parallel training method for MiniGo agents on large-scale heterogeneous computing platforms was proposed, comprising task-level parallelism between nodes, CPU-DSP (central processing unit - digital signal processor) heterogeneous parallelism, and DSP core parallelism. Efficient input/output deployment was realized and the network-communication bottleneck was eliminated. A heterogeneous-computing memory management scheme oriented to the CPU-DSP shared-memory architecture was proposed to reduce data movement between heterogeneous devices. Shared-memory programming optimization was implemented, and the dense convolution operators were accelerated on the DSP. Results show that, compared with 16-core CPU computation, the maximum speedup of single-core DSP operator acceleration is 16.44. With this method, the scale of computing nodes was expanded from 1067 to 4139, the time required to reach the given termination condition was reduced from 43.02 h to 16.05 h, and the scaling efficiency was 69.1%. Evaluation shows that this method achieves efficient parallel training of MiniGo on large-scale heterogeneous computing platforms. © 2024 National University of Defense Technology. All rights reserved.
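The abstract's shared-memory management idea, letting the host and the accelerator side operate on one buffer so that data movement between devices is avoided, can be sketched in a minimal hypothetical form. This is not the authors' code: the DSP side is mimicked by a second CPU process, the ReLU operator and all names are illustrative assumptions, and a POSIX "fork" start method is assumed.

```python
# Hypothetical sketch (not the paper's implementation): two processes share
# one buffer, so the "operator" runs with no host<->device data copies.
import multiprocessing as mp

def dsp_side(shared):
    # Stand-in for a DSP operator: transform the shared buffer in place;
    # the result is immediately visible to the host process.
    for i in range(len(shared)):
        shared[i] = max(shared[i], 0.0)

def run_shared(values):
    ctx = mp.get_context("fork")      # POSIX-only; assumed for this sketch
    shared = ctx.Array("d", values)   # one shared allocation, filled once
    p = ctx.Process(target=dsp_side, args=(shared,))
    p.start()
    p.join()
    return list(shared)               # read back from the same buffer

if __name__ == "__main__":
    print(run_shared([-1.0, 2.0, -3.0, 4.0]))  # -> [0.0, 2.0, 0.0, 4.0]
```

The point of the sketch is only the memory-management pattern: the buffer is allocated once in shared memory, so neither launching the worker nor collecting its result requires a copy across the host/device boundary.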
DOI
10.11887/j.cn.202405022
Pages: 209-218
Related papers (50 records)
  • [1] MLPs: Efficient Training of MiniGo on Large-scale Heterogeneous Computing System
    Qiao, Peng
    He, Zhouyu
    Li, Rongchun
    Jiang, Jingfei
    Dou, Yong
    Li, Dongsheng
    2022 IEEE 28TH INTERNATIONAL CONFERENCE ON PARALLEL AND DISTRIBUTED SYSTEMS, ICPADS, 2022, : 475 - 482
  • [2] A CLOUD COMPUTING PLATFORM FOR LARGE-SCALE FORENSIC COMPUTING
    Roussev, Vassil
    Wang, Liqiang
    Richard, Golden
    Marziale, Lodovico
    ADVANCES IN DIGITAL FORENSICS V, 2009, 306 : 201 - 214
  • [3] A Search Method of Large-Scale Resources for Providing Efficient Computing on a Participating Fine-Granular Cloud Computing Platform
    Nishii, Kento
    Tanigawa, Yosuke
    Tode, Hideki
    2018 IEEE 7TH INTERNATIONAL CONFERENCE ON CLOUD NETWORKING (CLOUDNET), 2018,
  • [4] Efficient Integration Method of Large-Scale Heterogeneous Security Logs Using NoSQL in Cloud Computing Environment
    Jeong, Huijin
    Piao, Xuefeng
    Choi, Junho
    Shin, Juhyun
    Kim, Pankoo
    JOURNAL OF INTERNET TECHNOLOGY, 2016, 17 (02): : 267 - 275
  • [5] A Music Recommendation Method for Large-Scale Music Library on a Heterogeneous Platform
    Zheng, Yao
    Xiao, Limin
    Tang, Wenqi
    Ruan, Li
    ALGORITHMS AND ARCHITECTURES FOR PARALLEL PROCESSING, ICA3PP 2014, PT I, 2014, 8630 : 472 - 482
  • [6] Advanced learning for large-scale heterogeneous computing
    Zou, Quan
    Liu, Wei
    Merler, Michele
    Ji, Rongrong
    NEUROCOMPUTING, 2016, 217 : 1 - 2
  • [7] An efficient method for computing response functions for large-scale vibrational systems
    Terao, T
    Nakayama, T
    PHYSICA B-CONDENSED MATTER, 1996, 219-20 : 357 - 360
  • [8] GridGAS: An I/O-Efficient Heterogeneous FPGA plus CPU Computing Platform for Very Large-Scale Graph Analytics
    Zou, Yu
    Lin, Mingjie
    2018 INTERNATIONAL CONFERENCE ON FIELD-PROGRAMMABLE TECHNOLOGY (FPT 2018), 2018, : 249 - 252
  • [9] Large-Scale Feature Matching with Distributed and Heterogeneous Computing
    Mills, Steven
    Eyers, David
    Leung, Kai-Cheung
    Tang, Xiaoxin
    Huang, Zhiyi
    PROCEEDINGS OF 2013 28TH INTERNATIONAL CONFERENCE ON IMAGE AND VISION COMPUTING NEW ZEALAND (IVCNZ 2013), 2013, : 208 - 213
  • [10] Disco: A Computing Platform for Large-Scale Data Analytics
    Mundkur, Prashanth
    Tuulos, Ville
    Flatow, Jared
    ERLANG 11: PROCEEDINGS OF THE 2011 ACM SIGPLAN ERLANG WORKSHOP, 2011, : 84 - 89