A Speedy Reinforcement Learning-Based Energy Management Strategy for Fuel Cell Hybrid Vehicles Considering Fuel Cell System Lifetime

Cited by: 35
Authors
Li, Wei [1,2]
Ye, Jiaye [1]
Cui, Yunduan [1]
Kim, Namwook [3]
Cha, Suk Won [4]
Zheng, Chunhua [1]
Affiliations
[1] Chinese Acad Sci, Shenzhen Univ Town, Shenzhen Inst Adv Technol, 1068 Xueyuan Ave, Shenzhen 518055, Peoples R China
[2] Univ Chinese Acad Sci, 19 A Yuquan Rd, Beijing 100049, Peoples R China
[3] Hanyang Univ, Dept Mech Engn, 55 Hanyangdeahak Ro, Ansan 15588, Gyeonggi Do, South Korea
[4] Seoul Natl Univ, Sch Mech & Aerosp Engn, San 56-1, Seoul 151742, South Korea
Keywords
Energy management strategy; Fuel cell hybrid vehicle; Lifetime enhancement; Pre-initialization; Speedy reinforcement learning; PONTRYAGINS MINIMUM PRINCIPLE; ELECTRIC VEHICLES; POWER MANAGEMENT; MODEL; PREDICTION;
DOI
10.1007/s40684-021-00379-8
Chinese Library Classification
X [Environmental Science, Safety Science]
Discipline code
08; 0830
Abstract
A speedy reinforcement learning (RL)-based energy management strategy (EMS) is proposed for fuel cell hybrid vehicles (FCHVs) in this research. It approaches near-optimal results with a fast convergence rate thanks to a pre-initialization framework, and it also extends the fuel cell system (FCS) lifetime. In the pre-initialization framework, well-designed power-distribution rules are used to pre-initialize the Q-table of the RL algorithm, which accelerates its optimization process. Driving cycles are modeled as Markov processes, and the FCS power difference between adjacent time steps is used to evaluate the impact on the FCS lifetime. The proposed RL-based EMS is trained on three driving cycles and validated on another driving cycle. Simulation results demonstrate that the average fuel-consumption difference between the proposed EMS and a dynamic programming-based EMS is 5.59% across the training and validation driving cycles. Additionally, the power fluctuation of the FCS is reduced by at least 13% with the proposed EMS compared to a conventional RL-based EMS that does not consider the FCS lifetime, which is significantly beneficial for improving the FCS lifetime. Furthermore, the convergence speed of the proposed EMS is 69% higher than that of the conventional RL-based EMS owing to the pre-initialization framework, which demonstrates its potential for real-time applications.
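The two mechanisms highlighted in the abstract, a rule-based pre-initialization of the Q-table and a reward that penalizes the FCS power change between adjacent time steps, can be illustrated with a short sketch. The following Python code is an illustration only, not the authors' implementation: the state and action discretizations, the rule_based_action initializer, the toy hydrogen-consumption model, and all constants are assumptions chosen for readability.

import numpy as np

N_DEMAND, N_SOC, N_ACTIONS = 20, 10, 15          # assumed discretization sizes
ALPHA, GAMMA, EPS = 0.1, 0.95, 0.1               # assumed learning rate, discount, exploration rate
FCS_POWER = np.linspace(0.0, 60.0, N_ACTIONS)    # assumed FCS power levels in kW

def rule_based_action(demand_idx, soc_idx):
    # Hypothetical power-follower rule used only to seed the Q-table.
    target = demand_idx / (N_DEMAND - 1) * (N_ACTIONS - 1)
    bias = N_SOC // 2 - soc_idx                  # request more FCS power when SOC is low
    return int(np.clip(round(target + bias), 0, N_ACTIONS - 1))

def pre_initialize_q(bonus=1.0):
    # Give the rule-preferred action a head start instead of starting from zeros.
    q = np.zeros((N_DEMAND, N_SOC, N_ACTIONS))
    for d in range(N_DEMAND):
        for s in range(N_SOC):
            q[d, s, rule_based_action(d, s)] = bonus
    return q

def reward(h2_rate, fcs_power, prev_fcs_power, w_h2=1.0, w_fluct=0.05):
    # Negative cost: hydrogen use plus an FCS power-fluctuation penalty (lifetime proxy).
    return -(w_h2 * h2_rate + w_fluct * abs(fcs_power - prev_fcs_power))

def q_learning_step(q, state, prev_action, next_state, rng):
    # One epsilon-greedy Q-learning update over the assumed state/action grids.
    d, s = state
    a = int(rng.integers(N_ACTIONS)) if rng.random() < EPS else int(np.argmax(q[d, s]))
    h2_rate = 0.02 * FCS_POWER[a]                # toy hydrogen-consumption model
    r = reward(h2_rate, FCS_POWER[a], FCS_POWER[prev_action])
    nd, ns = next_state
    q[d, s, a] += ALPHA * (r + GAMMA * np.max(q[nd, ns]) - q[d, s, a])
    return a

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    q = pre_initialize_q()
    prev_action = 0
    for _ in range(1000):                        # random states stand in for a Markov-modeled driving cycle
        state = (int(rng.integers(N_DEMAND)), int(rng.integers(N_SOC)))
        next_state = (int(rng.integers(N_DEMAND)), int(rng.integers(N_SOC)))
        prev_action = q_learning_step(q, state, prev_action, next_state, rng)

Under these assumptions, seeding the Q-table with the rule's preferred actions is what speeds up convergence, and the |P_FCS(t) - P_FCS(t-1)| term in the reward is what discourages FCS power fluctuation and thus serves as the lifetime-related objective.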
Pages: 859 - 872
Number of pages: 14