Approximate Cost-Optimal Energy Management of Hydrogen Electric Multiple Unit Trains Using Double Q-Learning Algorithm

Cited by: 74
Authors
Li, Qi [1 ]
Meng, Xiang [1 ]
Gao, Fei [3 ,4 ]
Zhang, Guorui [2 ]
Chen, Weirong [1 ]
Affiliations
[1] Southwest Jiaotong Univ, Dept Elect Engn, Chengdu 611756, Sichuan, Peoples R China
[2] CRRC Qingdao Sifang Co Ltd, Qingdao 266111, Shandong, Peoples R China
[3] Univ Bourgogne Franche Comte, FEMTO ST Inst, Rue Ernest Thierry Mieg, F-90010 Belfort, France
[4] Univ Bourgogne Franche Comte, FCLAB, UTBM, CNRS, Rue Ernest Thierry Mieg, F-90010 Belfort, France
Keywords
Fuel cells; Energy management; Optimization; Hydrogen; Resistance; Batteries; Hybrid power systems; fuel cell; hydrogen; rail transportation; POWER MANAGEMENT; HYBRID; STRATEGY; OPTIMIZATION; OPERATION; BATTERY; CONSUMPTION; VEHICLE; SYSTEM;
DOI
10.1109/TIE.2021.3113021
Chinese Library Classification (CLC): TP [Automation Technology, Computer Technology]
Subject Classification Code: 0812
Abstract
Energy management strategy (EMS) is the key to the performance of a fuel cell/battery hybrid system. Reinforcement learning (RL) has recently been introduced into this field and has gradually become a focus of research. However, traditional EMSs take only energy consumption into account when optimizing operating economy and ignore the cost caused by power source degradation, which results in poor operating economy in terms of total cost of ownership (TCO). In addition, most studied RL algorithms suffer from overestimation and restrict the battery state of charge (SOC) in an improper way, which likewise leads to relatively poor control performance. To solve these problems, this article first establishes a TCO model that includes energy consumption, equivalent energy consumption, and degradation of the power sources, and then adopts a Double Q-learning RL algorithm with a state constraint and a variable action space to determine the optimal EMS. Finally, using a hardware-in-the-loop platform, the feasibility, superiority, and generalization of the proposed EMS are demonstrated through comparisons with the optimal dynamic programming solution, a traditional RL EMS, and the equivalent consumption minimization strategy (ECMS) under both training and unknown operating conditions. The results show that the proposed strategy achieves high global optimality and excellent SOC control ability under both training and unknown conditions.
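The algorithmic core described in the abstract can be illustrated with a short tabular sketch. The Python code below shows a Double Q-learning update in which the battery SOC constraint is enforced through a variable (masked) action space. All identifiers, discretization sizes, SOC bounds, and the action-ordering convention are illustrative assumptions for exposition, not the authors' implementation; the per-step reward would be the negative TCO increment (energy plus degradation cost).

import numpy as np

# Sketch of Double Q-learning for a fuel cell / battery EMS (illustrative only).
# Assumed discretization: the state index encodes demand power and battery SOC,
# the action index selects a fuel cell output power level.
N_STATES, N_ACTIONS = 500, 21
Q_A = np.zeros((N_STATES, N_ACTIONS))
Q_B = np.zeros((N_STATES, N_ACTIONS))
alpha, gamma, epsilon = 0.1, 0.95, 0.1
rng = np.random.default_rng(0)

def feasible_actions(soc, soc_min=0.4, soc_max=0.8):
    # Variable action space: near the SOC limits, actions that would push SOC
    # further out of range are removed (assumed convention: low action index =
    # low fuel cell power, i.e., battery discharging; high index = charging).
    actions = np.arange(N_ACTIONS)
    if soc <= soc_min:
        return actions[N_ACTIONS // 2:]       # forbid discharging actions
    if soc >= soc_max:
        return actions[:N_ACTIONS // 2 + 1]   # forbid charging actions
    return actions

def choose_action(state, soc):
    acts = feasible_actions(soc)
    if rng.random() < epsilon:                # epsilon-greedy exploration
        return int(rng.choice(acts))
    q_sum = Q_A[state, acts] + Q_B[state, acts]
    return int(acts[np.argmax(q_sum)])        # act greedily on the combined estimate

def double_q_update(state, action, reward, next_state, next_soc):
    # Double Q-learning: one table selects the next action, the other evaluates
    # it, which reduces the maximization bias (overestimation) of Q-learning.
    acts = feasible_actions(next_soc)
    if rng.random() < 0.5:
        a_star = acts[np.argmax(Q_A[next_state, acts])]
        target = reward + gamma * Q_B[next_state, a_star]
        Q_A[state, action] += alpha * (target - Q_A[state, action])
    else:
        b_star = acts[np.argmax(Q_B[next_state, acts])]
        target = reward + gamma * Q_A[next_state, b_star]
        Q_B[state, action] += alpha * (target - Q_B[state, action])

Selecting the maximizing action with one table while evaluating it with the other is what separates Double Q-learning from the traditional Q-learning EMS used as a baseline, and the action masking plays the role of the state constraint on SOC.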
Pages: 9099 - 9110
Number of Pages: 12
Related Papers (50 records)
  • [1] Optimal Management of Office Energy Consumption via Q-learning Algorithm
    Shi, Guang
    Liu, Derong
    Wei, Qinglai
    2017 AMERICAN CONTROL CONFERENCE (ACC), 2017: 3318 - 3322
  • [2] Optimal Electric Vehicle Battery Management Using Q-learning for Sustainability
    Suanpang, Pannee
    Jamjuntr, Pitchaya
    SUSTAINABILITY, 2024, 16 (16)
  • [3] Energy management strategy for hybrid electric vehicles based on double Q-learning
    Han, Lijin
    Yang, Ke
    Zhang, Xin
    Yang, Ningkang
    Liu, Hui
    Liu, Jiaxin
    INTERNATIONAL CONFERENCE ON MECHANICAL DESIGN AND SIMULATION (MDS 2022), 2022, 12261
  • [4] Cost-optimal design and energy management of fuel cell electric trucks
    Ferrara, Alessandro
    Jakubek, Stefan
    Hametner, Christoph
    INTERNATIONAL JOURNAL OF HYDROGEN ENERGY, 2023, 48 (43) : 16420 - 16434
  • [5] Energy management based on reinforcement learning with double deep Q-learning for a hybrid electric tracked vehicle
    Han, Xuefeng
    He, Hongwen
    Wu, Jingda
    Peng, Jiankun
    Li, Yuecheng
    APPLIED ENERGY, 2019, 254
  • [6] An Online Home Energy Management System using Q-Learning and Deep Q-Learning
    Izmitligil, Hasan
    Karamancioglu, Abdurrahman
    SUSTAINABLE COMPUTING-INFORMATICS & SYSTEMS, 2024, 43
  • [7] An eco-driving algorithm for trains through distributing energy: A Q-Learning approach
    Zhu, Qingyang
    Su, Shuai
    Tang, Tao
    Liu, Wentao
    Zhang, Zixuan
    Tian, Qinghao
    ISA TRANSACTIONS, 2022, 122: 24 - 37
  • [8] Improved residential energy management system using priority double deep Q-learning
    Mathew, Alwyn
    Jolly, Milan Jeetendra
    Mathew, Jimson
    SUSTAINABLE CITIES AND SOCIETY, 2021, 69
  • [9] Rule and Q-learning based Hybrid Energy Management for Electric Vehicle
    Li, Yang
    Tao, Jili
    Han, Kai
    2019 CHINESE AUTOMATION CONGRESS (CAC2019), 2019: 51 - 56
  • [10] Energy Optimization of a Base Station using Q-learning Algorithm
    Aggarwal, Anisha
    Selvamuthu, Dharmaraja
    2023 17TH INTERNATIONAL CONFERENCE ON TELECOMMUNICATIONS, CONTEL, 2023