Learning Resource Allocation and Pricing for Cloud Profit Maximization

Times cited: 0
Authors
Du, Bingqian [1]
Wu, Chuan [1]
Huang, Zhiyi [1]
Affiliations
[1] Univ Hong Kong, Hong Kong, Peoples R China
Source
THIRTY-THIRD AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FIRST INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / NINTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE | 2019
Keywords
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence]
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Cloud computing has been widely adopted to support various computation services. A fundamental problem faced by cloud providers is how to efficiently allocate resources upon user requests and price the resource usage, in order to maximize resource efficiency and hence provider profit. Existing studies establish detailed performance models of cloud resource usage and propose offline or online algorithms to decide allocation and pricing. In contrast, we adopt a black-box approach and leverage model-free Deep Reinforcement Learning (DRL) to capture the dynamics of cloud users and better characterize the inherent connections between an optimal allocation/pricing policy and the states of the dynamic cloud system. The goal is to learn, through trial and error, a policy that maximizes the net profit of the cloud provider, outperforming decisions made on explicit performance models. We combine long short-term memory (LSTM) units with fully-connected neural networks in our DRL to deal with online user arrivals, and adjust the output and update methods of basic DRL algorithms to address both resource allocation and pricing. Evaluation based on real-world datasets shows that our DRL approach significantly outperforms basic DRL algorithms and state-of-the-art white-box online cloud resource allocation/pricing algorithms, in terms of both profit and the number of accepted users.
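The policy architecture described in the abstract (LSTM units summarizing online user arrivals, fully-connected layers on top, and separate outputs for the allocation decision and the price) can be sketched roughly as below. This is a minimal illustration under assumed settings, not the authors' implementation: the class name AllocPricePolicy, the feature and hidden dimensions, the discrete allocation head, and the Gaussian price head are all assumptions made for the example.

# Minimal sketch (assumed dimensions and heads, not the paper's code): an LSTM
# encodes the history of user requests; fully-connected layers then produce a
# discrete allocation decision and a price for the current request.
import torch
import torch.nn as nn

class AllocPricePolicy(nn.Module):
    def __init__(self, req_dim=8, hidden=64, n_alloc_actions=4):
        super().__init__()
        self.lstm = nn.LSTM(req_dim, hidden, batch_first=True)   # encodes online arrival history
        self.fc = nn.Sequential(nn.Linear(hidden, hidden), nn.ReLU())
        self.alloc_head = nn.Linear(hidden, n_alloc_actions)     # logits over allocation choices
        self.price_head = nn.Linear(hidden, 2)                    # mean and log-std of a price distribution

    def forward(self, history):
        # history: (batch, seq_len, req_dim) tensor of recent request features
        _, (h, _) = self.lstm(history)
        z = self.fc(h[-1])
        alloc_logits = self.alloc_head(z)
        price_mean, price_log_std = self.price_head(z).unbind(dim=-1)
        return alloc_logits, price_mean, price_log_std

# Example usage: sample an allocation action and a (non-negative) price for one request.
policy = AllocPricePolicy()
history = torch.randn(1, 5, 8)                                    # 5 most recent requests, 8 features each
alloc_logits, mu, log_std = policy(history)
alloc = torch.distributions.Categorical(logits=alloc_logits).sample()
price = torch.distributions.Normal(mu, log_std.exp()).sample().clamp(min=0.0)

In a policy-gradient setup, the log-probabilities of the sampled allocation and price would be weighted by the provider's realized net profit to update the network, which is one plausible reading of the "adjusted output and update methods" mentioned in the abstract.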
Pages: 7570-7577
Page count: 8