Edge/Cloud Infinite-Time Horizon Resource Allocation for Distributed Machine Learning and General Tasks

Cited by: 1
Authors
Sartzetakis, Ippokratis [1 ,2 ]
Soumplis, Polyzois [1 ,2 ]
Pantazopoulos, Panagiotis [2 ]
Katsaros, Konstantinos V. [2 ]
Sourlas, Vasilis [2 ]
Varvarigos, Emmanouel [1 ,2 ]
Affiliations
[1] Natl Tech Univ Athens, Sch Elect & Comp Engn, Athens 15773, Greece
[2] Natl Tech Univ Athens, Inst Commun & Comp Syst, Athens 15773, Greece
Funding
EU Horizon 2020;
Keywords
Cloud and edge computing; distributed computing; distributed machine learning; inference; training; resource allocation; INTERNET; IOT;
DOI
10.1109/TNSM.2023.3312593
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
Edge computing has emerged as a computing paradigm in which application and data processing take place close to the end devices. It shortens the distances over which data are transferred, offering reduced delay and faster response for general data processing and store/retrieve jobs. The benefits of edge computing can also be reaped by distributed computation algorithms, with the cloud playing an assistive role. In this context, an important challenge is to allocate the required edge and cloud resources to process data that are generated over a continuous ("infinite") time horizon. This is a complex problem due to the variety of requirements (resource needs, accuracy, delay, etc.) that each computation algorithm may pose, as well as the heterogeneity of the resources' features (e.g., processing, bandwidth). In this work, we develop a solution for serving weakly coupled general distributed algorithms, with emphasis on machine learning algorithms, at the edge and/or the cloud. We present a dual-objective Integer Linear Programming formulation that optimizes monetary cost and computation accuracy, and we introduce efficient heuristics to perform the resource allocation. We examine various distributed ML allocation scenarios using realistic parameters from actual vendors, and we quantify trade-offs among accuracy, performance, and the cost of edge/cloud bandwidth and processing resources. Our results indicate that, among the many parameters of interest, processing costs play the most important role in the allocation decisions. Finally, we explore interesting interactions between target accuracy, monetary cost, and delay.
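The abstract describes a dual-objective Integer Linear Program over monetary cost and computation accuracy. As a loose, hypothetical illustration of that style of formulation (not the paper's actual model), the sketch below places tasks on edge/cloud nodes so as to minimize monetary cost while enforcing a target average accuracy, i.e., an epsilon-constraint treatment of the second objective. It uses the open-source PuLP library, and all task, node, cost, accuracy, and capacity values are invented for illustration.

```python
# Hypothetical sketch of an edge/cloud task-placement ILP: minimize monetary
# cost subject to a minimum average accuracy (epsilon-constraint handling of
# the second objective). Not the paper's formulation; all values are made up.
import pulp

tasks = ["t1", "t2"]                    # distributed (sub)tasks to place
nodes = ["edge1", "edge2", "cloud"]     # candidate processing locations

# Assumed per-placement monetary cost (processing + bandwidth) and accuracy.
cost = {("t1", "edge1"): 3.0, ("t1", "edge2"): 2.5, ("t1", "cloud"): 1.5,
        ("t2", "edge1"): 2.0, ("t2", "edge2"): 2.2, ("t2", "cloud"): 1.2}
acc = {("t1", "edge1"): 0.95, ("t1", "edge2"): 0.93, ("t1", "cloud"): 0.85,
       ("t2", "edge1"): 0.92, ("t2", "edge2"): 0.90, ("t2", "cloud"): 0.84}
capacity = {"edge1": 1, "edge2": 1, "cloud": 2}  # tasks each node can host
min_avg_acc = 0.90                               # target average accuracy

prob = pulp.LpProblem("edge_cloud_placement", pulp.LpMinimize)

# x[t][n] = 1 if task t is placed on node n.
x = pulp.LpVariable.dicts("x", (tasks, nodes), cat="Binary")

# Objective: total monetary cost of the chosen placement.
prob += pulp.lpSum(cost[t, n] * x[t][n] for t in tasks for n in nodes)

# Each task is placed exactly once.
for t in tasks:
    prob += pulp.lpSum(x[t][n] for n in nodes) == 1

# Node capacity limits.
for n in nodes:
    prob += pulp.lpSum(x[t][n] for t in tasks) <= capacity[n]

# Accuracy target: average achieved accuracy must reach the threshold.
prob += (pulp.lpSum(acc[t, n] * x[t][n] for t in tasks for n in nodes)
         >= min_avg_acc * len(tasks))

prob.solve(pulp.PULP_CBC_CMD(msg=False))
for t in tasks:
    for n in nodes:
        if pulp.value(x[t][n]) > 0.5:
            print(f"{t} -> {n}")
```

With these invented numbers, the cheapest cloud-only placement misses the accuracy target, so the solver shifts both tasks to edge nodes; sweeping min_avg_acc would trace the kind of cost/accuracy trade-off that the paper explores with its dual-objective formulation and heuristics.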
Pages: 697-713
Page count: 17