QOS-AWARE FLOW CONTROL FOR POWER-EFFICIENT DATA CENTER NETWORKS WITH DEEP REINFORCEMENT LEARNING

Cited by: 0
Authors
Sun, Penghao [1 ]
Guo, Zehua [2 ]
Liu, Sen [3 ]
Lan, Julong [1 ]
Hu, Yuxiang [1 ]
Affiliations
[1] Natl Digital Switching Syst Engn & Technol R&D Ct, Zhengzhou, Peoples R China
[2] Beijing Inst Technol, Beijing, Peoples R China
[3] Cent South Univ, Changsha, Hunan, Peoples R China
Source
2020 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, 2020
Keywords
Data center network; Software-defined networking; Deep reinforcement learning; Power efficiency;
DOI
10.1109/icassp40776.2020.9054040
CLC Classification
O42 [Acoustics];
Discipline Codes
070206; 082403;
Abstract
Reducing power consumption and maintaining the Flow Completion Time (FCT) required for application Quality of Service (QoS) in Data Center Networks (DCNs) are two major concerns for data center operators. However, existing works either fail to guarantee QoS because they neglect FCT constraints, or achieve unsatisfactory power efficiency. In this paper, we propose SmartFCT, which employs Software-Defined Networking (SDN) coupled with Deep Reinforcement Learning (DRL) to improve the power efficiency of DCNs while guaranteeing FCT. The DRL agent generates a dynamic policy that consolidates traffic flows onto fewer active switches in the DCN for power efficiency, and the policy also leaves different margins on different active links and switches to avoid FCT violations caused by unexpected short bursts of flows. Simulation results show that, with a similar FCT guarantee, SmartFCT saves 8% more power than state-of-the-art solutions.
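The consolidate-with-headroom idea the abstract describes can be illustrated with a minimal greedy sketch. This is not the paper's DRL policy: the function name, capacity, and margin values are invented for illustration, and a simple first-fit-decreasing heuristic stands in for the learned agent.

```python
# Illustrative sketch: pack flow demands onto as few active links as
# possible while reserving a headroom margin on each link to absorb
# unexpected short bursts (the role SmartFCT's learned policy plays).

def consolidate(flows, capacity=10.0, margin=0.2):
    """Assign flow demands to links, opening a new link only when no
    active link can hold the flow within (1 - margin) * capacity."""
    limit = (1.0 - margin) * capacity       # usable share after headroom
    links = []                              # current load per active link
    for demand in sorted(flows, reverse=True):  # first-fit decreasing
        for i, load in enumerate(links):
            if load + demand <= limit:
                links[i] += demand          # consolidate onto active link
                break
        else:
            links.append(demand)            # power on another link/switch

    return links

# Five flows fit on two active links, each kept at or below 80% of capacity.
loads = consolidate([4.0, 3.0, 3.0, 2.0, 1.0], capacity=10.0, margin=0.2)
```

A larger margin trades some consolidation (more powered-on switches) for more burst headroom; the paper's contribution is learning per-link margins rather than fixing one globally.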
Pages: 3552 - 3556
Page count: 5
Related Papers (50 total)
  • [41] Deep Reinforcement Learning for Energy-Efficient Power Control in Heterogeneous Networks
    Peng, Jianhao
    Zheng, Jiabao
    Zhang, Lin
    Xiao, Ming
    IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS (ICC 2022), 2022, : 141 - 146
  • [42] QoS-Aware Task Offloading in Fog Environment Using Multi-agent Deep Reinforcement Learning
    Jain, Vibha
    Kumar, Bijendra
    JOURNAL OF NETWORK AND SYSTEMS MANAGEMENT, 2023, 31 (01)
  • [43] PPDRL: A Pretraining-and-Policy-Based Deep Reinforcement Learning Approach for QoS-Aware Service Composition
    Yi, Kan
    Yang, Jin
    Wang, Shuangling
    Zhang, Zhengtong
    Ren, Xiao
    SECURITY AND COMMUNICATION NETWORKS, 2022, 2022
  • [44] QoS-Aware Autonomous Distributed Power Control in Co-Channel Femtocell Networks
    Chakchouk, Nessrine
    Hamdaoui, Bechir
    2012 IEEE GLOBAL COMMUNICATIONS CONFERENCE (GLOBECOM), 2012, : 567 - 571
  • [45] QoS-Aware Power Allocation Scheme for Relay Satellite Networks
    Yu, Jin'ao
    Tang, Xiaogang
    Zhu, Shibing
    Sun, Haopeng
    Zhong, Lunxin
    Yang, Guangyu
    JOURNAL OF BEIJING INSTITUTE OF TECHNOLOGY, 2021, 30 (01): 82 - 90
  • [46] QoS-aware cooperative power control and resource allocation scheme in LTE femtocell networks
    Wang, Chiapin
    Kuo, Wen-Hsing
    Chu, Chun-Yu
    COMPUTER COMMUNICATIONS, 2017, 110 : 164 - 174
  • [47] μ-DDRL: A QoS-Aware Distributed Deep Reinforcement Learning Technique for Service Offloading in Fog Computing Environments
    Goudarzi, M.
    Rodriguez, M. A.
    Sarvi, M.
    Buyya, R.
    IEEE TRANSACTIONS ON SERVICES COMPUTING, 2024, 17 (01): 47 - 59
  • [49] QoS-aware power control and handoff prioritization in 3G WCDMA networks
    Rachidi, TE
    Elbatji, AY
    Sebbane, M
    Bouzekri, H
    2004 IEEE WIRELESS COMMUNICATIONS AND NETWORKING CONFERENCE, VOLS 1-4: BROADBAND WIRELESS - THE TIME IS NOW, 2004, : 665 - 670