An Optimum E-Vehicle Energy Management System using Deep Reinforcement Learning

© 2023 by IJETT Journal
Volume-71 Issue-5
Year of Publication : 2023
Author : S. Manoj, S. Pradeep Kumar
DOI : 10.14445/22315381/IJETT-V71I5P223

How to Cite?

S. Manoj, S. Pradeep Kumar, "An Optimum E-Vehicle Energy Management System using Deep Reinforcement Learning," International Journal of Engineering Trends and Technology, vol. 71, no. 5, pp. 219-227, 2023. Crossref, https://doi.org/10.14445/22315381/IJETT-V71I5P223

Abstract
A growing body of evidence indicates that incorporating onboard computer vision hardware and software into modern automotive systems aids in the pursuit of eco-driving goals. Automotive engineers face a lengthy and tedious task when developing Energy Management Strategies (EMSs) for various hybrid electric vehicle configurations. By capitalizing on similarities between different hybrid electric vehicle EMSs, experienced engineers can shorten the development cycle, and the automated EMS development framework proposed here aims to accelerate it further. The study combines computer vision with deep reinforcement learning to improve the fuel economy of hybrid electric vehicles. The proposed method autonomously learns an optimal control policy from observed data. A state-of-the-art convolutional neural network-based object detection technique extracts useful visual information from onboard cameras, and a continuous deep reinforcement learning model takes the detected visual information as part of its state input and generates power-saving control actions. Specifically, the sharing of knowledge among four very different hybrid electric vehicle types is investigated: a transfer learning-based approach is proposed to automate the improvement of hybrid electric vehicle EMSs by exchanging cross-type knowledge between EMSs built on different deep reinforcement learning variants. According to the findings, the proposed method approaches the fuel economy of the global optimization programming benchmark, and the deep reinforcement learning-based system with image perception uses less fuel than the one without visual information. Battery modeling, accurate battery state-of-charge and state-of-health estimation, and the development of other advanced EMSs in EVs address most of the remaining problems, allowing more precise driving range estimates and more efficient charging and discharging strategies. The proposed strategy was shown to be effective and reliable in reducing losses and increasing safety during training and validation. It also outperformed deep learning-based methods in the computation time required and in the energy lost in the hybrid battery bank. This supports the use of the method in the development of future energy management systems.
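The abstract does not give implementation details, so the following is only a minimal sketch of the architecture it describes: detected visual information entering the state of a continuous deep reinforcement learning policy that outputs a power-split action. It assumes a DDPG-style actor network and a hypothetical state vector (detector outputs plus battery and vehicle signals); the names and dimensions are illustrative, not taken from the paper.

```python
# Minimal sketch (illustrative, not the paper's implementation):
# a DDPG-style actor mapping a state vector -- hypothetical visual features
# from an onboard object detector plus battery SOC and vehicle speed --
# to a continuous power-split action.
import torch
import torch.nn as nn

class PowerSplitActor(nn.Module):
    def __init__(self, state_dim: int, action_dim: int = 1, hidden: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, action_dim), nn.Tanh(),  # action bounded in [-1, 1]
        )

    def forward(self, state: torch.Tensor) -> torch.Tensor:
        return self.net(state)

# Hypothetical state: [lead-object distance (m), lead-object speed (m/s),
#                      traffic-signal flag, battery SOC, vehicle speed (m/s)]
state = torch.tensor([[25.0, 8.3, 1.0, 0.62, 13.9]])
actor = PowerSplitActor(state_dim=state.shape[1])
power_split = actor(state)  # later rescaled to an engine/motor torque split
print(power_split)
```

In a full DDPG setup this actor would be trained against a critic network using a reward that penalizes fuel use; here it only illustrates how visual detections can be concatenated with powertrain signals to form the policy's state input.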

Keywords
Electric Vehicle (EV), Computer vision, BMS, Deep Reinforcement Learning.
