Optimal Management of Office Energy Consumption via Q-learning Algorithm

Cited: 0
Authors
Shi, Guang [1 ]
Liu, Derong [2 ]
Wei, Qinglai [1 ]
Affiliations
[1] Chinese Acad Sci, Inst Automat, State Key Lab Management & Control Complex Syst, Beijing 100190, Peoples R China
[2] Univ Sci & Technol Beijing, Sch Automat & Elect Engn, Beijing 100083, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
ECHO STATE NETWORK; RECOGNITION; SYSTEMS;
DOI
None available
CLC Classification Number
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
In this paper, a Q-learning based algorithm is developed to optimize energy consumption in an office, where solar energy is introduced as the renewable source and a battery is installed as the control unit. The energy consumption in the office, regarded as the energy demand, is divided into consumption from sockets, lights, and air-conditioners. First, the time series of the real-time electricity rate, renewable energy, and energy demand are modeled by echo state networks as periodic functions. Second, given these periodic functions, a Q-learning based algorithm is developed for optimal control of the battery in the office, so that the total cost of energy purchased from the grid is reduced. Finally, numerical analysis is conducted to show the performance of the developed algorithm.
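The control scheme in the abstract can be illustrated with a minimal toy sketch (not the paper's implementation): tabular Q-learning that schedules a battery against a periodic electricity price. Here the state is (hour of day, battery level), the actions are charge / idle / discharge, and the reward is the negative cost of energy bought from the grid. The price, demand, and solar profiles are illustrative sinusoids standing in for the echo-state-network forecasts used in the paper; all names and numbers are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
HOURS, LEVELS = 24, 5            # discretized time of day and battery charge
ACTIONS = [-1, 0, +1]            # discharge one level, idle, charge one level

# Illustrative periodic profiles (stand-ins for the ESN forecasts in the paper).
price  = 0.10 + 0.05 * np.sin(2 * np.pi * np.arange(HOURS) / 24)            # $/kWh
demand = 2.0  + 1.0  * np.sin(2 * np.pi * np.arange(HOURS) / 24)            # kWh
solar  = np.clip(1.5 * np.sin(np.pi * (np.arange(HOURS) - 6) / 12), 0, None)

def step(hour, level, a):
    """Apply a battery action; return next state and reward (negative cost)."""
    new_level = int(np.clip(level + a, 0, LEVELS - 1))
    battery_flow = new_level - level          # >0 charging, <0 discharging
    grid = max(demand[hour] - solar[hour] + battery_flow, 0.0)
    return (hour + 1) % HOURS, new_level, -price[hour] * grid

Q = np.zeros((HOURS, LEVELS, len(ACTIONS)))
alpha, gamma, eps = 0.1, 0.95, 0.1            # learning rate, discount, exploration

for episode in range(500):
    hour, level = 0, LEVELS // 2
    for _ in range(HOURS):
        if rng.random() < eps:                # epsilon-greedy exploration
            a = int(rng.integers(len(ACTIONS)))
        else:
            a = int(np.argmax(Q[hour, level]))
        nh, nl, r = step(hour, level, ACTIONS[a])
        # Standard Q-learning update toward the one-step bootstrapped target.
        Q[hour, level, a] += alpha * (r + gamma * Q[nh, nl].max() - Q[hour, level, a])
        hour, level = nh, nl

greedy = Q.argmax(axis=2)   # learned policy: best action index per (hour, level)
print(greedy.shape)         # (24, 5)
```

The battery acts as the only controllable element, as in the paper: the learned greedy policy tends to charge when price minus solar surplus is low and discharge when it is high, which is exactly the cost-reduction mechanism the abstract describes.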
Pages: 3318-3322
Page count: 5
Related Papers
50 records in total
  • [1] Q-learning algorithm for optimal multilevel thresholding
    Yin, PY
    IC-AI'2001: PROCEEDINGS OF THE INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOLS I-III, 2001, : 335 - 340
  • [2] Q-Learning Algorithm and CMAC Approximation Based Robust Optimal Control for Renewable Energy Management Systems
    Vy Huynh Tuyet
    Luy Nguyen Tan
    CONTROL ENGINEERING AND APPLIED INFORMATICS, 2022, 24 (01) : 15 - 25
  • [3] An Online Home Energy Management System using Q-Learning and Deep Q-Learning
    Izmitligil, Hasan
    Karamancioglu, Abdurrahman
    SUSTAINABLE COMPUTING-INFORMATICS & SYSTEMS, 2024, 43
  • [4] Fundamental Q-learning Algorithm in Finding Optimal Policy
    Sun, Canyu
    2017 INTERNATIONAL CONFERENCE ON SMART GRID AND ELECTRICAL AUTOMATION (ICSGEA), 2017, : 243 - 246
  • [5] A Q-Learning Approach to Derive Optimal Consumption and Investment Strategies
    Weissensteiner, Alex
    IEEE TRANSACTIONS ON NEURAL NETWORKS, 2009, 20 (08) : 1234 - 1243
  • [6] Q-learning Algorithm for Energy Management in Solar Powered Embedded Monitoring Systems
    Prauzek, Michal
    Mourcet, Nicolas R. A.
    Hlavica, Jakub
    Musilek, Petr
    2018 IEEE CONGRESS ON EVOLUTIONARY COMPUTATION (CEC), 2018, : 1068 - 1074
  • [7] Backward Q-learning: The combination of Sarsa algorithm and Q-learning
    Wang, Yin-Hao
    Li, Tzuu-Hseng S.
    Lin, Chih-Jui
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2013, 26 (09) : 2184 - 2193
  • [8] Optimal Trajectory Output Tracking Control with a Q-learning Algorithm
    Vamvoudakis, Kyriakos G.
    2016 AMERICAN CONTROL CONFERENCE (ACC), 2016, : 5752 - 5757
  • [9] Nonlinear neuro-optimal tracking control via stable iterative Q-learning algorithm
    Wei, Qinglai
    Song, Ruizhuo
    Sun, Qiuye
    NEUROCOMPUTING, 2015, 168 : 520 - 528
  • [10] Approximate Cost-Optimal Energy Management of Hydrogen Electric Multiple Unit Trains Using Double Q-Learning Algorithm
    Li, Qi
    Meng, Xiang
    Gao, Fei
    Zhang, Guorui
    Chen, Weirong
    IEEE TRANSACTIONS ON INDUSTRIAL ELECTRONICS, 2022, 69 (09) : 9099 - 9110