QOS-AWARE FLOW CONTROL FOR POWER-EFFICIENT DATA CENTER NETWORKS WITH DEEP REINFORCEMENT LEARNING

Cited: 0
Authors
Sun, Penghao [1 ]
Guo, Zehua [2 ]
Liu, Sen [3 ]
Lan, Julong [1 ]
Hu, Yuxiang [1 ]
Affiliations
[1] Natl Digital Switching Syst Engn & Technol R&D Ct, Zhengzhou, Peoples R China
[2] Beijing Inst Technol, Beijing, Peoples R China
[3] Cent South Univ, Changsha, Hunan, Peoples R China
Keywords
Data center network; Software-defined networking; Deep reinforcement learning; Power efficiency
DOI
10.1109/icassp40776.2020.9054040
Chinese Library Classification
O42 [Acoustics]
Subject Classification Codes
070206; 082403
Abstract
Reducing power consumption while maintaining the Flow Completion Time (FCT) required for application Quality of Service (QoS) in Data Center Networks (DCNs) are two major concerns for data center operators. However, existing works either fail to guarantee QoS because they neglect FCT constraints, or they achieve unsatisfactory power efficiency. In this paper, we propose SmartFCT, which couples Software-Defined Networking (SDN) with Deep Reinforcement Learning (DRL) to improve the power efficiency of DCNs while guaranteeing FCT. The DRL agent generates a dynamic policy that consolidates traffic flows onto fewer active switches for power efficiency, and the policy also leaves different margins on different active links and switches to avoid FCT violations caused by unexpected short bursts of flows. Simulation results show that, with a similar FCT guarantee, SmartFCT saves 8% more power than state-of-the-art solutions.
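For intuition only, the sketch below mirrors the stated design goal (consolidate flows onto fewer active switches while keeping FCT within a QoS bound and leaving headroom on active links for bursts) as a toy reward function plus an epsilon-greedy action picker. All names, constants, and the state/action layout are illustrative assumptions, not the paper's actual DRL formulation.

import random

def reward(active_switches, total_switches, fct, fct_deadline, link_utils, margin=0.2):
    # Reward powering off switches; heavily penalize an FCT (QoS) violation;
    # mildly penalize active links loaded beyond (1 - margin), so the leftover
    # headroom can absorb short bursts without violating FCT.
    power_saving = 1.0 - active_switches / total_switches
    fct_penalty = 10.0 if fct > fct_deadline else 0.0
    headroom_penalty = sum(max(0.0, u - (1.0 - margin)) for u in link_utils)
    return power_saving - fct_penalty - headroom_penalty

def pick_action(q_values, epsilon=0.1):
    # Epsilon-greedy choice over consolidation actions
    # (e.g., how many switches/links to keep active).
    if random.random() < epsilon:
        return random.randrange(len(q_values))
    return max(range(len(q_values)), key=q_values.__getitem__)

# Example: 12 of 20 switches active, FCT within its deadline, one link near saturation.
print(reward(active_switches=12, total_switches=20,
             fct=8.0, fct_deadline=10.0,
             link_utils=[0.55, 0.9, 0.3]))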
Pages: 3552-3556
Page count: 5
Related Papers
50 records in total
  • [1] QoS-aware data center network reconfiguration method based on deep reinforcement learning
    Guo, Xiaotao
    Yan, Fulong
    Xue, Xuwei
    Pan, Bitao
    Exarchakos, George
    Calabretta, Nicola
    JOURNAL OF OPTICAL COMMUNICATIONS AND NETWORKING, 2021, 13 (05) : 94 - 107
  • [2] QoS-Aware Joint Offloading and Power Control Using Deep Reinforcement Learning in MEC
    Li, Xiang
    Chen, Yu
    2020 23RD INTERNATIONAL SYMPOSIUM ON WIRELESS PERSONAL MULTIMEDIA COMMUNICATIONS (WPMC 2020), 2020,
  • [3] QoS-Aware Power-Efficient Scheduler for LTE Uplink
    Kalil, Mohamad
    Shami, Abdallah
    Al-Dweik, Arafat
    IEEE TRANSACTIONS ON MOBILE COMPUTING, 2015, 14 (08) : 1672 - 1685
  • [4] QoS-Aware Power Management with Deep Learning
    Zhou, Junxiu
    Liu, Xian
    Tao, Yangyang
    Yu, Shucheng
    2019 IFIP/IEEE SYMPOSIUM ON INTEGRATED NETWORK AND SERVICE MANAGEMENT (IM), 2019, : 289 - 294
  • [5] Power-efficient and QoS-aware scheduling in Bluetooth scatternet for wireless PANs
    Joo, YI
    Lee, TJ
    Eom, DS
    Lee, YW
    Tchah, KH
    IEEE TRANSACTIONS ON CONSUMER ELECTRONICS, 2003, 49 (04) : 1067 - 1072
  • [6] Dynamic Flow Scheduling for Power-Efficient Data Center Networks
    Guo, Zehua
    Hui, Shufeng
    Xu, Yang
    Chao, H. Jonathan
    2016 IEEE/ACM 24TH INTERNATIONAL SYMPOSIUM ON QUALITY OF SERVICE (IWQOS), 2016,
  • [7] IQoR: An Intelligent QoS-aware Routing Mechanism with Deep Reinforcement Learning
    Cao, Yuanyuan
    Dai, Bin
    Mo, Yijun
    Xu, Yang
    PROCEEDINGS OF THE 2020 IEEE 45TH CONFERENCE ON LOCAL COMPUTER NETWORKS (LCN 2020), 2020, : 329 - 332
  • [8] QoS-Aware Scheduling in New Radio Using Deep Reinforcement Learning
    Stigenberg, Jakob
    Saxena, Vidit
    Tayamon, Soma
    Ghadimi, Euhanna
    2021 IEEE 32ND ANNUAL INTERNATIONAL SYMPOSIUM ON PERSONAL, INDOOR AND MOBILE RADIO COMMUNICATIONS (PIMRC), 2021,
  • [9] A Reinforcement Learning QoI/QoS-Aware Approach in Acoustic Sensor Networks
    Afifi, Haitham
    Ramaswamy, Arunselvan
    Karl, Holger
    2021 IEEE 18TH ANNUAL CONSUMER COMMUNICATIONS & NETWORKING CONFERENCE (CCNC), 2021,
  • [10] Power-Aware Traffic Engineering for Data Center Networks via Deep Reinforcement Learning
    Gao, Minglan
    Pan, Tian
    Song, Enge
    Yang, Mengqi
    Huang, Tao
    Liu, Yunjie
    2022 IEEE GLOBAL COMMUNICATIONS CONFERENCE (GLOBECOM 2022), 2022, : 6055 - 6060