Creating Robust Deep Neural Networks With Coded Distributed Computing for IoT

Cited by: 1
Authors
Hadidi, Ramyad [1 ]
Cao, Jiashen [2 ]
Asgari, Bahar [2 ,3 ]
Kim, Hyesoon [2 ]
Affiliations
[1] Rain AI, Atlanta, GA 30332 USA
[2] Georgia Tech, Atlanta, GA USA
[3] Univ Maryland, College Pk, MD USA
Keywords
Edge AI; Reliability; IoT; Edge; Distributed Computing; Collaborative Edge & Robotics;
DOI
10.1109/EDGE60047.2023.00029
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
The increasing interest in serverless computation and ubiquitous wireless networks has surrounded us with numerous connected devices. Such IoT devices have access to an abundance of raw data, but their limited computing resources constrain what they can do with it. With the emergence of deep neural networks (DNNs), the demand for computing power on IoT devices keeps growing. To overcome these limited resources, several studies have proposed distribution methods that harvest the aggregated computing power of idle IoT devices in an environment. However, because such a distributed system relies strongly on every participating device, the unstable latency and intermittent failures that are common to IoT devices and wireless networks cause high recovery overheads. To reduce this overhead, we propose a novel robustness method with close-to-zero recovery latency for DNN computations. Our solution never loses a request and never spends time recovering from a failure. To achieve this, we first analyze how the matrix computations in DNNs are affected by distribution. We then introduce a novel coded distributed computing (CDC) method whose cost, unlike that of modular redundancy, remains constant as the number of devices increases. Our method is applied at the library level, without requiring extensive changes to the program, while still ensuring a balanced work assignment during distribution.
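As a rough illustration of the general coded distributed computing idea described in the abstract (not the authors' exact scheme), the sketch below applies a simple sum-parity code to a distributed matrix-vector product, the core operation of a fully connected DNN layer: the weight matrix is split row-wise across k workers, one extra worker computes a parity block, and the result can be decoded from any k of the k+1 partial products, so a single failed or straggling device incurs no recovery latency. The NumPy setup, function names, and equal-size block assumption are illustrative choices, not part of the paper.

```python
# Minimal sketch of coded distributed matrix-vector multiplication
# (generic sum-parity / MDS-style idea; assumes k equal-size row blocks).
import numpy as np

def encode(W, k):
    """Split W into k row blocks and append one parity block (their sum)."""
    blocks = np.array_split(W, k, axis=0)   # blocks must have equal shape
    parity = sum(blocks)
    return blocks + [parity]

def decode(partials, k):
    """Recover y = W @ x from any k of the k+1 partial products.

    `partials` maps worker index (0..k) to its partial result; at most one
    entry (a failed device) may be missing.
    """
    if all(i in partials for i in range(k)):            # no data block lost
        return np.concatenate([partials[i] for i in range(k)])
    missing = next(i for i in range(k) if i not in partials)
    # Reconstruct the missing block from the parity worker's result.
    recovered = partials[k] - sum(partials[i] for i in range(k) if i != missing)
    return np.concatenate([partials.get(i, recovered) for i in range(k)])

# Toy usage: 3 workers hold weight blocks, a 4th holds the parity block.
k = 3
W = np.random.randn(6, 4)
x = np.random.randn(4)
coded_blocks = encode(W, k)
partials = {i: blk @ x for i, blk in enumerate(coded_blocks)}
del partials[1]                    # simulate one device failing mid-inference
y = decode(partials, k)
assert np.allclose(y, W @ x)       # full result recovered with no re-execution
```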
Pages: 126 - 132
Page count: 7
Related Papers
50 items in total
  • [31] Solve Game of the Amazons with Neural Networks and Distributed Computing
    Chang, Yuanxing
    Qin, Hongkun
    PROCEEDINGS OF THE 30TH CHINESE CONTROL AND DECISION CONFERENCE (2018 CCDC), 2018, : 6627 - 6632
  • [32] Deep distributed convolutional neural networks: Universality
    Zhou, Ding-Xuan
    ANALYSIS AND APPLICATIONS, 2018, 16 (06) : 895 - 919
  • [33] Robust Test Selection for Deep Neural Networks
    Sun, Weifeng
    Yan, Meng
    Liu, Zhongxin
    Lo, David
    IEEE TRANSACTIONS ON SOFTWARE ENGINEERING, 2023, 49 (12) : 5250 - 5278
  • [34] Robust Large Margin Deep Neural Networks
    Sokolic, Jure
    Giryes, Raja
    Sapiro, Guillermo
    Rodrigues, Miguel R. D.
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2017, 65 (16) : 4265 - 4280
  • [35] Robust learning of parsimonious deep neural networks
    Guenter, Valentin Frank Ingmar
    Sideris, Athanasios
    NEUROCOMPUTING, 2024, 566
  • [36] Towards robust explanations for deep neural networks
    Dombrowski, Ann-Kathrin
    Anders, Christopher J.
    Mueller, Klaus-Robert
    Kessel, Pan
    PATTERN RECOGNITION, 2022, 121
  • [37] Towards Robust Deep Neural Networks with BANG
    Rozsa, Andras
    Gunther, Manuel
    Boult, Terrance E.
    2018 IEEE WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION (WACV 2018), 2018, : 803 - 811
  • [38] Distributed Newton Methods for Deep Neural Networks
    Wang, Chien-Chih
    Tan, Kent Loong
    Chen, Chun-Ting
    Lin, Yu-Hsiang
    Keerthi, S. Sathiya
    Mahajan, Dhruv
    Sundararajan, S.
    Lin, Chih-Jen
    NEURAL COMPUTATION, 2018, 30 (06) : 1673 - 1724
  • [39] Quality Robust Mixtures of Deep Neural Networks
    Dodge, Samuel F.
    Karam, Lina J.
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2018, 27 (11) : 5553 - 5562
  • [40] A Cross-Layer Optimization Framework for Distributed Computing in IoT Networks
    Shang, Bodong
    Liu, Shiya
    Lu, Sidi
    Yi, Yang
    Shi, Weisong
    Liu, Lingjia
    2020 IEEE/ACM SYMPOSIUM ON EDGE COMPUTING (SEC 2020), 2020, : 440 - 444