Creating Robust Deep Neural Networks With Coded Distributed Computing for IoT

Cited by: 1
Authors
Hadidi, Ramyad [1 ]
Cao, Jiashen [2 ]
Asgari, Bahar [2 ,3 ]
Kim, Hyesoon [2 ]
Affiliations
[1] Rain AI, Atlanta, GA 30332 USA
[2] Georgia Tech, Atlanta, GA USA
[3] Univ Maryland, College Park, MD USA
Keywords
Edge AI; Reliability; IoT; Edge; Distributed Computing; Collaborative Edge & Robotics;
DOI
10.1109/EDGE60047.2023.00029
Chinese Library Classification (CLC): TP [Automation, Computer Technology]
Discipline Code: 0812
Abstract
The increasing interest in serverless computation and ubiquitous wireless networks has led to numerous connected devices in our surroundings. Such IoT devices have access to an abundance of raw data, but their limited computing resources constrain their capabilities. With the emergence of deep neural networks (DNNs), the demand on the computing power of IoT devices is increasing. To overcome these inadequate resources, several studies have proposed distribution methods that harvest the aggregated computing power of idle IoT devices in an environment. However, because such a distributed system relies strongly on every device, the unstable latency and intermittent failures that are common characteristics of IoT devices and wireless networks cause high recovery overheads. To reduce this overhead, we propose a novel robustness method with close-to-zero recovery latency for DNN computations. Our solution never loses a request or spends time recovering from a failure. To do so, we first analyze how the matrix computations in DNNs are affected by distribution. We then introduce a novel coded distributed computing (CDC) method whose cost, unlike that of modular redundancies, remains constant as the number of devices increases. Our method is applied at the library level, without requiring extensive changes to the program, while still ensuring a balanced work assignment during distribution.
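To illustrate the idea behind coded distributed computing for DNN workloads, the following is a minimal sketch of a (3-worker, 1-parity) erasure-coded matrix-vector product, the core kernel of a fully connected layer. The specific splitting and parity scheme here is an illustrative assumption for clarity, not the paper's exact construction; its key property matches the abstract's claim: any one worker may fail and the result is still recovered without re-execution.

```python
import numpy as np

# Illustrative CDC sketch (assumed scheme, not the paper's exact method):
# compute y = A @ x across 3 workers such that any 2 results suffice.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))   # layer weight matrix
x = rng.standard_normal(3)        # layer input

# Split the weight matrix row-wise across two "systematic" workers.
A1, A2 = A[:2], A[2:]

# A third worker holds a coded (parity) shard: the element-wise sum.
Ap = A1 + A2

# Each worker computes its local product independently.
y1, y2, yp = A1 @ x, A2 @ x, Ap @ x

# Suppose worker 2 fails: its partial result is recovered algebraically
# from the parity, with no recomputation and no recovery latency.
y2_recovered = yp - y1
y_full = np.concatenate([y1, y2_recovered])

assert np.allclose(y_full, A @ x)
```

Note that the parity worker adds one extra shard regardless of how many systematic workers the matrix is split across, which is the sense in which a CDC scheme's overhead can stay constant as the device count grows, in contrast to duplicating every worker under modular redundancy.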
Pages: 126-132 (7 pages)