Reinforcement learning based demand charge minimization using energy storage

Cited by: 0
Authors
Weber, Lucas [1 ,2 ]
Busic, Ana [1 ,2 ]
Zhu, Jiamin [3 ]
Affiliations
[1] PSL Res Univ, INRIA, Paris, France
[2] PSL Res Univ, DI ENS, Ecole Normale Super, CNRS, Paris, France
[3] IFP Energies Nouvelles, 1&4 Ave Bois Preau, F-92852 Rueil Malmaison, France
Keywords
DOI
10.1109/CDC49753.2023.10383414
CLC classification
TP [automation technology, computer technology];
Discipline code
0812;
Abstract
Utilities have introduced demand charges to encourage customers to reduce their demand peaks, since a high peak can impose very high costs on both the utility and the consumer. We study the bill minimization problem for customers equipped with an energy storage device and self-owned renewable energy production. A model-free reinforcement learning algorithm is carefully designed to reduce both the energy charge and the demand charge of the consumer. The proposed algorithm requires no forecasting models for the energy demand or the renewable energy production. The resulting controller can be used online and progressively improved with newly gathered data. The algorithm is validated on real data from an office building at the IFPEN Solaize site. Numerical results show that our algorithm reduces electricity bills under both daily and monthly demand charges.
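To make the problem setting concrete, the following is a minimal Python sketch of a model-free, tabular Q-learning battery controller that trades off an energy charge against a monthly demand charge. It is not the authors' algorithm: the tariff values, battery parameters, toy net-load profile (building load minus self-owned PV), state discretization, and the peak-tracking reward term are all illustrative assumptions introduced here.

# Minimal sketch (not the paper's implementation): tabular Q-learning for a
# battery controller balancing an energy charge and a monthly demand charge.
# All tariffs, battery parameters, and the toy load/PV profile are assumptions.
import numpy as np

rng = np.random.default_rng(0)

ENERGY_PRICE = 0.15      # $/kWh energy charge (illustrative)
DEMAND_PRICE = 12.0      # $/kW demand charge on the monthly peak (illustrative)
BATTERY_KWH = 50.0       # usable storage capacity
POWER_KW = 10.0          # max charge/discharge power per step
DT_H = 1.0               # one-hour decision steps
ACTIONS = np.array([-POWER_KW, 0.0, POWER_KW])  # discharge / idle / charge

def net_load(hour):
    """Toy net demand (building load minus PV) in kW, purely illustrative."""
    load = 20.0 + 10.0 * np.sin(2 * np.pi * (hour - 14) / 24)
    pv = max(0.0, 8.0 * np.sin(np.pi * (hour - 6) / 12))
    return max(0.0, load - pv + rng.normal(0.0, 1.0))

def discretize(hour, soc, peak):
    """Map (hour of day, state of charge, running monthly peak) to a table index."""
    return (int(hour),
            int(np.clip(soc / BATTERY_KWH * 9, 0, 9)),
            int(np.clip(peak / 40.0 * 9, 0, 9)))

Q = np.zeros((24, 10, 10, len(ACTIONS)))
alpha, gamma, eps = 0.1, 0.99, 0.1

for episode in range(200):             # each episode = one billing month
    soc, peak = BATTERY_KWH / 2, 0.0
    for t in range(24 * 30):
        hour = t % 24
        s = discretize(hour, soc, peak)
        a = rng.integers(len(ACTIONS)) if rng.random() < eps else int(np.argmax(Q[s]))

        # Enforce state-of-charge limits, then compute grid import.
        battery_kw = np.clip(ACTIONS[a], -soc / DT_H, (BATTERY_KWH - soc) / DT_H)
        grid_kw = max(0.0, net_load(hour) + battery_kw)
        soc += battery_kw * DT_H

        # Reward: negative energy charge, minus the marginal demand-charge
        # increase whenever a new monthly peak is set.
        new_peak = max(peak, grid_kw)
        reward = -(ENERGY_PRICE * grid_kw * DT_H + DEMAND_PRICE * (new_peak - peak))
        peak = new_peak

        s_next = discretize((hour + 1) % 24, soc, peak)
        Q[s][a] += alpha * (reward + gamma * Q[s_next].max() - Q[s][a])

Charging the demand-charge penalty incrementally, only when a new peak is set, is one simple way to keep the reward dense; how the actual paper shapes the reward and handles daily versus monthly billing periods is not reproduced here.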
Pages: 4351-4357
Page count: 7
Related papers
50 records in total
  • [31] On Certainty Equivalence of Demand Charge Reduction Using Storage
    Yu, Jiafan
    Qin, Junjie
    Rajagopal, Ram
    2017 AMERICAN CONTROL CONFERENCE (ACC), 2017, : 3430 - 3437
  • [32] Bipedal Walking Energy Minimization by Reinforcement Learning with Evolving Policy Parameterization
    Kormushev, Petar
    Ugurlu, Barkan
    Calinon, Sylvain
    Tsagarakis, Nikolaos G.
    Caldwell, Darwin G.
    2011 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS, 2011, : 318 - 324
  • [33] An Integrated Demand Response-Based Energy Management Strategy for Integrated Energy System Based on Deep Reinforcement Learning
    Han, Baohui
    Hu, Mingjie
    Lv, Shilin
    Bao, Zhejing
    Lu, Lingxia
    Yu, Miao
    2023 6TH INTERNATIONAL CONFERENCE ON RENEWABLE ENERGY AND POWER ENGINEERING, REPE 2023, 2023, : 407 - 412
  • [34] Joint Supply, Demand, and Energy Storage Management Towards Microgrid Cost Minimization
    Sun, Sun
    Dong, Min
    Liang, Ben
    2014 IEEE INTERNATIONAL CONFERENCE ON SMART GRID COMMUNICATIONS (SMARTGRIDCOMM), 2014, : 109 - 114
  • [35] Deep reinforcement learning for energy management in a microgrid with flexible demand
    Nakabi, Taha Abdelhalim
    Toivanen, Pekka
    SUSTAINABLE ENERGY GRIDS & NETWORKS, 2021, 25
  • [36] Using reinforcement learning to improve exploration trajectories for error minimization
    Kollar, Thomas
    Roy, Nicholas
    2006 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), VOLS 1-10, 2006, : 3338 - +
  • [37] Autonomous power management in mobile devices using dynamic frequency scaling and reinforcement learning for energy minimization
    Carvalho, Sidartha A. L.
    Cunha, Daniel C.
    Silva-Filho, Abel G.
    MICROPROCESSORS AND MICROSYSTEMS, 2019, 64: 205 - 220
  • [38] A Novel Stackelberg-Game-Based Energy Storage Sharing Scheme Under Demand Charge
    Li, Bingyun
    Yang, Qinmin
    Kamwa, Innocent
    IEEE-CAA JOURNAL OF AUTOMATICA SINICA, 2023, 10 (02) : 462 - 473
  • [40] CONTROL OF SHARED ENERGY STORAGE ASSETS WITHIN BUILDING CLUSTERS USING REINFORCEMENT LEARNING
    Odonkor, Philip
    Lewis, Kemper
    PROCEEDINGS OF THE ASME INTERNATIONAL DESIGN ENGINEERING TECHNICAL CONFERENCES AND COMPUTERS AND INFORMATION IN ENGINEERING CONFERENCE, 2018, VOL 2A