Optimal Electric Vehicle Battery Management Using Q-learning for Sustainability

Cited by: 3
Authors
Suanpang, Pannee [1]
Jamjuntr, Pitchaya [2]
Affiliations
[1] Suan Dusit Univ, Fac Sci & Technol, Dept Informat Technol, Bangkok 10300, Thailand
[2] King Mongkuts Univ Technol Thonburi, Fac Engn, Dept Elect & Telecommun Engn, Bangkok 10140, Thailand
Keywords
optimizing; Q-learning; battery management; electric vehicle; sustainability; enhancing performance; smart city
DOI
10.3390/su16167180
Chinese Library Classification (CLC)
X [Environmental science, safety science]
Subject classification codes
08; 0830
Abstract
This paper presents a comprehensive study on the optimization of electric vehicle (EV) battery management using Q-learning, a reinforcement learning technique. As demand for electric vehicles continues to grow, there is an increasing need for efficient battery-management strategies that extend battery life, enhance performance, and minimize operating costs. The primary objective of this research is to develop and assess a Q-learning-based approach to the challenges associated with EV battery management. The paper first outlines the key challenges inherent in EV battery management and discusses the potential advantages of incorporating Q-learning into the optimization process. Leveraging Q-learning's capacity to make dynamic decisions based on past experience, we introduce a framework that treats state-of-charge, state-of-health, charging infrastructure, and driving patterns as critical state variables. The methodology is detailed, encompassing the selection of states, actions, rewards, and policy, with the training process informed by real-world data. Our experimental results underscore the efficacy of the Q-learning approach in optimizing battery management, yielding substantial improvements in battery performance, energy efficiency, and overall EV sustainability. A comparative analysis with traditional battery-management strategies highlights the superior performance of our approach: the Q-learning-based method achieves a 15% improvement in energy efficiency over conventional methods, translating into lower operational costs and reduced environmental impact, and a 20% increase in battery lifespan, demonstrating its effectiveness in enhancing long-term sustainability and user satisfaction. This paper enriches the body of knowledge on EV battery management by introducing a data-driven approach, providing a comprehensive comparative analysis, and applying novel methodologies for practical implementation. The implications of this research extend beyond the academic sphere to practical applications, fostering broader adoption of electric vehicles and contributing to reduced environmental impact while enhancing user satisfaction.
Pages: 50
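The abstract names the standard Q-learning ingredients (state, action, reward, policy) but, being an abstract, gives no implementation detail. Purely as an illustration of the technique named in the title, the sketch below shows a tabular Q-learning loop for charge/idle/drive decisions over a discretized state of charge. The state buckets, action set, reward shaping, and hyperparameters are assumptions made for this example and do not reflect the authors' implementation.

```python
# A minimal tabular Q-learning sketch for a simplified EV charging problem.
# The state discretization (SoC and time-of-day buckets), the charge/idle/drive
# action set, the reward shaping, and all hyperparameters are assumptions made
# for illustration; they are not taken from the paper.
import random

import numpy as np

N_SOC_BINS = 10                      # state of charge discretized into 10 buckets
N_HOUR_BINS = 4                      # coarse time-of-day buckets
ACTIONS = ["charge", "idle", "drive"]
ALPHA, GAMMA, EPSILON = 0.1, 0.95, 0.1

q_table = np.zeros((N_SOC_BINS, N_HOUR_BINS, len(ACTIONS)))


def step(soc, hour, action):
    """Toy environment: return (next_soc, next_hour, reward).

    The reward penalizes charging during an assumed peak period (hour == 2)
    and penalizes SoC extremes as a crude proxy for battery degradation.
    """
    if action == 0:                                  # charge
        next_soc = min(soc + 1, N_SOC_BINS - 1)
        reward = -2.0 if hour == 2 else -0.5         # peak electricity costs more
    elif action == 2:                                # drive
        next_soc = max(soc - 2, 0)
        reward = 1.0                                 # utility gained from driving
    else:                                            # idle
        next_soc, reward = soc, 0.0
    if next_soc in (0, N_SOC_BINS - 1):
        reward -= 1.0                                # discourage deep discharge / overcharge
    return next_soc, (hour + 1) % N_HOUR_BINS, reward


def choose_action(soc, hour):
    """Epsilon-greedy selection over the Q-table."""
    if random.random() < EPSILON:
        return random.randrange(len(ACTIONS))
    return int(np.argmax(q_table[soc, hour]))


for episode in range(5000):
    soc, hour = random.randrange(N_SOC_BINS), random.randrange(N_HOUR_BINS)
    for _ in range(48):                              # 48 decisions per simulated day
        a = choose_action(soc, hour)
        nxt_soc, nxt_hour, r = step(soc, hour, a)
        # Q-learning update: Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))
        td_target = r + GAMMA * np.max(q_table[nxt_soc, nxt_hour])
        q_table[soc, hour, a] += ALPHA * (td_target - q_table[soc, hour, a])
        soc, hour = nxt_soc, nxt_hour

# Inspect the learned greedy policy for a mid-range SoC at each time bucket.
for h in range(N_HOUR_BINS):
    print(f"hour bucket {h}: {ACTIONS[int(np.argmax(q_table[5, h]))]}")
```

The printed greedy policy is only a sanity check on the toy environment; the framework described in the abstract additionally conditions on state-of-health, charging infrastructure, and driving patterns, and is trained on real-world data.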