Deep reinforcement learning for wind and energy storage coordination in wholesale energy and ancillary service markets

Cited by: 5
Authors
Li, Jinhao [1 ]
Wang, Changlong [2 ,3 ]
Wang, Hao [1 ,3 ]
Affiliations
[1] Monash Univ, Fac Informat Technol, Dept Data Sci & AI, Melbourne, Vic, Australia
[2] Monash Univ, Dept Civil Engn, Melbourne, Vic, Australia
[3] Monash Univ, Monash Energy Inst, Melbourne, Vic, Australia
Funding
Australian Research Council;
Keywords
Wind-battery system; Wind curtailment; Electricity market; Deep reinforcement learning; SYSTEMS; UNITS;
DOI
10.1016/j.egyai.2023.100280
Chinese Library Classification (CLC) number
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Wind energy has been increasingly adopted to mitigate climate change. However, the variability of wind energy causes wind curtailment, resulting in considerable economic losses for wind farm owners. Wind curtailment can be reduced by using battery energy storage systems (BESS) as onsite backup sources. Yet this auxiliary role may significantly weaken the economic potential of the BESS in energy trading. Ideal BESS scheduling should balance onsite wind-curtailment reduction against market bidding, but practical implementation is challenging due to coordination complexity and the stochastic nature of energy prices and wind generation. We investigate the joint-market bidding strategy of a co-located wind-battery system in the spot and Regulation Frequency Control Ancillary Service markets. We propose a novel deep reinforcement learning-based approach that decouples the system's market participation into two interrelated Markov decision processes, one for each facility, enabling the BESS to absorb onsite wind curtailment while bidding in both markets to maximize overall operational revenue. Using realistic wind farm data, we validate the coordinated bidding strategy: it outperforms an optimization-based benchmark, earning approximately 25% higher revenue and achieving 2.3 times greater wind-curtailment reduction. Our results show that joint-market bidding can significantly improve the financial performance of wind-battery systems compared with participating in each market separately. Simulations also show that using curtailed wind generation as a power source for charging the BESS can yield additional financial gains. Successful implementation of our algorithm would encourage the co-location of generation and storage assets and unlock wider system benefits.
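
The decoupling described in the abstract, with one decision process per facility, can be illustrated with a small sketch. The toy environment below is a speculative illustration under stated assumptions, not the authors' implementation: it separates the wind facility's spot-market reward from the BESS reward, and lets the BESS charge from otherwise-curtailed wind while also bidding into the spot and regulation markets. All class names, capacities, efficiencies, and price ranges are assumptions made for the example.

import numpy as np

class WindBatteryEnv:
    """Toy co-located wind-battery environment with decoupled rewards.

    The wind facility is paid for energy dispatched in the spot market;
    the BESS absorbs otherwise-curtailed wind, settles its own spot bid,
    and earns regulation availability payments. All figures are
    illustrative assumptions."""

    def __init__(self, capacity_mwh=50.0, power_mw=25.0, eta=0.9, seed=0):
        self.rng = np.random.default_rng(seed)
        self.capacity = capacity_mwh          # assumed BESS energy capacity (MWh)
        self.power = power_mw                 # assumed BESS power rating (MW)
        self.eta = eta                        # assumed charging efficiency
        self.soc = 0.5 * capacity_mwh         # state of charge (MWh)

    def _sample_exogenous(self):
        # Synthetic stand-ins for stochastic wind availability and prices.
        wind = self.rng.uniform(0.0, 60.0)     # MW of available wind
        cap = self.rng.uniform(20.0, 60.0)     # MW the grid will accept
        spot = self.rng.uniform(-20.0, 300.0)  # $/MWh spot price
        reg = self.rng.uniform(0.0, 50.0)      # $/MW/h regulation price
        return wind, cap, spot, reg

    def step(self, wind_bid_mw, bess_spot_mw, bess_reg_mw):
        """One 5-minute dispatch interval; positive bess_spot_mw = discharge."""
        wind, cap, spot, reg = self._sample_exogenous()
        dt = 5.0 / 60.0                        # interval length in hours

        # Wind-facility decision process: revenue from dispatched energy;
        # anything above the dispatch cap or the bid is curtailed.
        dispatched = min(wind, cap, wind_bid_mw)
        curtailed = max(wind - dispatched, 0.0)
        wind_revenue = dispatched * dt * spot

        # BESS decision process: first absorb curtailed wind onsite,
        # then settle the spot bid and earn regulation availability.
        absorb = min(curtailed, self.power,
                     (self.capacity - self.soc) / (dt * self.eta))
        self.soc += absorb * dt * self.eta

        discharge = float(np.clip(bess_spot_mw, -self.power, self.power))
        if discharge > 0.0:                    # cannot discharge below empty
            discharge = min(discharge, self.soc / dt)
        self.soc = float(np.clip(self.soc - discharge * dt, 0.0, self.capacity))
        reg_enabled = float(np.clip(bess_reg_mw, 0.0, self.power))
        bess_revenue = discharge * dt * spot + reg_enabled * reg * dt

        # Separate reward streams allow one DRL agent per facility, while
        # shared quantities (state of charge, curtailment) couple the two.
        obs = np.array([wind, cap, spot, reg, self.soc])
        rewards = {"wind": wind_revenue, "bess": bess_revenue}
        return obs, rewards, curtailed - absorb

if __name__ == "__main__":
    env = WindBatteryEnv()
    obs, rewards, residual_curtailment = env.step(
        wind_bid_mw=40.0, bess_spot_mw=10.0, bess_reg_mw=5.0)
    print(rewards, residual_curtailment)

In a full implementation, two policy networks (one per reward stream) would be trained jointly, for example with an actor-critic method, on historical price and wind traces rather than the synthetic samples above.
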
Pages: 12
Related Papers
50 records in total
  • [41] Investment strategies for energy storage systems in a joint energy and frequency ancillary service market
    Chen, Sheng
    Zhou, Jingwen
    Han, Haiteng
    Zhang, Xuan
    Zhou, Yizhou
    Wei, Zhinong
    JOURNAL OF ENERGY STORAGE, 2025, 116
  • [42] Battery and Hydrogen Energy Storage Control in a Smart Energy Network with Flexible Energy Demand Using Deep Reinforcement Learning
    Samende, Cephas
    Fan, Zhong
    Cao, Jun
    Fabian, Renzo
    Baltas, Gregory N.
    Rodriguez, Pedro
    ENERGIES, 2023, 16 (19)
  • [43] Control Strategy of Microgrid Energy Storage System Based on Deep Reinforcement Learning
    Liang H.
    Li H.
    Zhang H.
    Hu Z.
    Qin Z.
    Cao J.
    Dianwang Jishu/Power System Technology, 2021, 45 (10): 3869-3876
  • [44] Optimizing airborne wind energy with reinforcement learning
    N. Orzan
    C. Leone
    A. Mazzolini
    J. Oyero
    A. Celani
    The European Physical Journal E, 2023, 46
  • [45] An Integrated Energy Service Channel Optimization Mechanism Based on Deep Reinforcement Learning
    Ma Q.-L.
    Yu P.
    Wu J.-H.
    Xiong A.
    Yan Y.
    Beijing Youdian Daxue Xuebao/Journal of Beijing University of Posts and Telecommunications, 2020, 43 (02): 87-93
  • [46] Event-Based Deep Reinforcement Learning for Smoothing Ramp Events in Combined Wind-Storage Energy Systems
    Li, Jiang
    Wang, Xing
    Liu, Bo
    IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS, 2024, 20 (05): 7871-7882
  • [47] Multi-objective energy dispatch with deep reinforcement learning for wind-solar-thermal-storage hybrid systems
    Wang, Conghao
    Ma, Yan
    Xie, Jingjing
    Ouyang, Quan
    JOURNAL OF ENERGY STORAGE, 2025, 105
  • [48] Optimizing airborne wind energy with reinforcement learning
    Orzan, N.
    Leone, C.
    Mazzolini, A.
    Oyero, J.
    Celani, A.
    EUROPEAN PHYSICAL JOURNAL E, 2023, 46 (01)
  • [49] Voltage Control-Based Ancillary Service Using Deep Reinforcement Learning
    Lukianykhin, Oleh
    Bogodorova, Tetiana
    ENERGIES, 2021, 14 (08)
  • [50] Distributed Online Service Coordination Using Deep Reinforcement Learning
    Schneider, Stefan
    Qarawlus, Haydar
    Karl, Holger
    2021 IEEE 41ST INTERNATIONAL CONFERENCE ON DISTRIBUTED COMPUTING SYSTEMS (ICDCS 2021), 2021: 539-549