Optimal Management of Office Energy Consumption via Q-learning Algorithm

Times cited: 0
Authors
Shi, Guang [1 ]
Liu, Derong [2 ]
Wei, Qinglai [1 ]
Affiliations
[1] Chinese Acad Sci, Inst Automat, State Key Lab Management & Control Complex Syst, Beijing 100190, Peoples R China
[2] Univ Sci & Technol Beijing, Sch Automat & Elect Engn, Beijing 100083, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
ECHO STATE NETWORK; RECOGNITION; SYSTEMS;
DOI
Not available
CLC classification
TP [Automation Technology, Computer Technology];
Discipline classification code
0812;
Abstract
In this paper, a Q-learning based algorithm is developed to optimize energy consumption in an office, where solar energy is introduced as the renewable source and a battery is installed as the control unit. The energy consumption in the office, regarded as the energy demand, is divided into the consumption of sockets, lights, and air-conditioners. First, the time series of the real-time electricity rate, renewable generation, and energy demand are modeled by echo state networks as periodic functions. Second, given these periodic functions, a Q-learning based algorithm is developed for optimal control of the battery in the office, so that the total cost of energy drawn from the grid is reduced. Finally, numerical analysis is conducted to show the performance of the developed algorithm.
Pages: 3318-3322
Number of pages: 5
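
The abstract describes two components: echo-state-network models of the periodic electricity-rate, renewable-generation, and demand profiles, and a Q-learning controller that schedules the office battery to reduce the cost of energy drawn from the grid. The sketch below is a minimal tabular Q-learning illustration of the battery-scheduling idea only; the hourly discretization, the toy rate/solar/demand profiles, the state-of-charge grid, the reward (negative grid cost), and all hyperparameters are assumptions made for illustration and are not taken from the paper.

```python
# Minimal tabular Q-learning sketch for hourly battery scheduling.
# All profiles, discretizations, and hyperparameters below are illustrative
# assumptions, not the setup used in the paper.
import numpy as np

rng = np.random.default_rng(0)

HOURS = 24                      # one daily cycle (periodic profiles)
SOC_LEVELS = 11                 # battery state of charge discretized to 0..10
ACTIONS = np.array([-1, 0, 1])  # discharge one level, idle, charge one level
STEP_KWH = 0.5                  # energy moved per charge/discharge level

# Assumed periodic hourly profiles: electricity rate, solar output, demand.
rate = 0.10 + 0.10 * np.sin(np.linspace(0, 2 * np.pi, HOURS))               # $/kWh
solar = np.clip(np.sin(np.linspace(-np.pi / 2, 3 * np.pi / 2, HOURS)), 0, 1)  # kWh
demand = 1.0 + 0.5 * np.cos(np.linspace(0, 2 * np.pi, HOURS))               # kWh

Q = np.zeros((HOURS, SOC_LEVELS, len(ACTIONS)))
alpha, gamma, eps = 0.1, 0.95, 0.1

def step(hour, soc, a_idx):
    """Apply one battery action; return next state and reward (negative grid cost)."""
    next_soc = int(np.clip(soc + ACTIONS[a_idx], 0, SOC_LEVELS - 1))
    battery_flow = (next_soc - soc) * STEP_KWH          # >0 charging, <0 discharging
    grid_energy = max(demand[hour] - solar[hour] + battery_flow, 0.0)
    cost = rate[hour] * grid_energy
    return (hour + 1) % HOURS, next_soc, -cost

for episode in range(5000):
    hour, soc = 0, SOC_LEVELS // 2
    for _ in range(HOURS):
        # Epsilon-greedy action selection over the three battery actions.
        if rng.random() < eps:
            a = int(rng.integers(len(ACTIONS)))
        else:
            a = int(np.argmax(Q[hour, soc]))
        nh, nsoc, r = step(hour, soc, a)
        # Standard Q-learning update toward the one-step bootstrapped target.
        Q[hour, soc, a] += alpha * (r + gamma * np.max(Q[nh, nsoc]) - Q[hour, soc, a])
        hour, soc = nh, nsoc

greedy = Q.argmax(axis=2)        # learned hourly charge/idle/discharge schedule
print(greedy[:, SOC_LEVELS // 2])  # policy at mid state of charge, one action per hour
```

Indexing the Q-table by (hour of day, state of charge) leans on the same periodicity assumption that the paper encodes through its echo-state-network models of rate, renewable generation, and demand.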