Estimating Battery Life in Electric Vehicles using Deeper Long Short-Term Memory (DLSTM) Algorithm
The paper estimates electric vehicle battery life using a Deep Long Short-Term Memory (DLSTM) algorithm. This algorithm employs a Forget Gate with a sigmoid function to retain or discard information from previous states. The Input Gate, also using a sigmoid function, determines what new information to add, while a tanh function creates a new candidate vector for updating the cell state. The Cell State Update combines the inputs from the forget and input gates. The Output Gate uses a sigmoid function to select which part of the cell state to output. This AI algorithm analyses voltage, current, and temperature during charging to predict the lithium-ion battery's state.
Introduction
Electric cars are an alternative to traditional vehicles such as gasoline and diesel cars, reducing consumer fuel costs and protecting the environment [1], [2]. Therefore, predicting battery life is essential to help manufacturers optimise energy systems and increase vehicle reliability and performance [3]. In addition, consumers need accurate information to make reasonable choices, save costs, and reduce electronic waste. Moreover, estimating battery life also helps manage vehicle performance throughout its life cycle. Understanding factors such as temperature and charging methods allows users to extend battery life [4], [5]. Currently, there are different methods to estimate battery life for electric vehicles. In addition to traditional methods such as charge and discharge cycle testing, some more modern techniques can be used to calculate the life of electric vehicle batteries. One such method is numerical simulation, which models the behaviour of batteries under different operating conditions, thereby predicting performance and lifetime.
Capacity analysis is also an effective way to estimate lifetime by monitoring the variation of battery capacity over time. Temperature and humidity measurement devices can also provide valuable information about the operating environment, as this directly affects battery lifetime [6]–[8]. Additionally, machine learning methods can be applied to analyse data collected from electric vehicles, thereby creating more accurate predictive models of battery lifetime based on real-world usage conditions [9]–[11]. Finally, using nanomaterials in battery structures also opens new prospects for improving lifetime by enhancing battery endurance and minimising battery degradation.
The above contributions show that estimating battery life using artificial intelligence algorithms gives more accurate results. This paper presents the prediction of the remaining useful life (RUL) of electric vehicle batteries using Deep Long Short-Term Memory (DLSTM) neural networks based on the analysis of power degradation. This method allows accurate estimation of RUL, especially in smart urban environments, by combining federated learning and LSTM [12], [13]. Based on the literature [14]–[16], DLSTM is a type of recurrent neural network (RNN) designed to process and predict sequential data, especially when there is long-term dependence. The LSTM predicts a battery's RUL by processing time-series data, which is crucial for tracking battery status over time. Its unique structure features long-term and short-term memory, enabling it to retain essential degradation patterns. The input gate determines what current information to store, the forget gate decides what to discard from previous states, and the output gate generates the final output. Ultimately, the DLSTM uses historical battery performance data to predict RUL.
The paper consists of five main sections. The first section provides an overview of the goals and solutions for battery life prediction. Section 2 details the battery's key parameters and the data-gathering process. Section 3 describes the LSTM layer architecture used for machine learning. Section 4 covers the training and test data, the training parameters, the evaluation of the trained model, and the predicted and actual cycle life results. Finally, Section 5 presents the conclusions drawn from the simulation and assessment.
Battery Key Parameters
Battery Temperature Parameter
Managing temperature fluctuations is crucial for optimising battery performance and ensuring safety in various applications. Closely monitoring and controlling temperature changes can prolong battery life and improve efficiency. The electrochemical reactions and I²R losses contribute to this temperature. As battery capacity degrades, the temperature rise increases, causing significant thermal variation. The temperature change throughout each cycle can be calculated with (1), where E is the electromotive force and Ti is the i-th temperature sample in n cycles.
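Equation (1) itself is not reproduced in this version of the text. A minimal sketch of one plausible form, assuming the per-cycle temperature variation is measured as the mean absolute deviation of the n temperature samples from their cycle average (an assumption; only the symbol definitions survive here), is:

$$\Delta T = \frac{1}{n}\sum_{i=1}^{n}\left|T_i - \bar{T}\right|, \qquad \bar{T} = \frac{1}{n}\sum_{i=1}^{n} T_i$$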
Variation in the Electrode Voltage
The variation in electrode voltage is directly related to how quickly the battery's capacity decreases. Equation (2) expresses the change in discharge voltage for each cycle, where E is the electromotive force and Vi is the i-th voltage sample in n cycles.
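The explicit form of (2) is likewise not reproduced here. A common health-indicator formulation consistent with the stated symbols, given as a sketch rather than the authors' exact equation, is the mean gap between the electromotive force E and the sampled discharge voltages:

$$\Delta V = \frac{1}{n}\sum_{i=1}^{n}\left(E - V_i\right)$$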
NASA Battery
According to Su et al. [5], the NASA Li-ion dataset used for predicting remaining useful life was collected under the following conditions:
- Temperature: 24 °C
- Charging: carried out in a constant current (CC) mode at 1.5 A until the battery voltage reached 4.2 V and then continued in a constant voltage (CV) mode until the charge current dropped to 20 mA.
- Discharge: carried out at a constant current (CC) level of 2 A until the battery voltage fell to 2.7 V, 2.5 V, 2.2 V, and 2.5 V for batteries 5, 6, 7, and 18, respectively.
- End of Life (EOL): the remaining capacity reaches 70%–80% of the initial capacity, i.e., a fade from 2 Ah to 1.4 Ah. A minimal check of this criterion is sketched below.
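The following MATLAB sketch illustrates how the end-of-life cycle and the remaining useful life could be derived from a capacity-versus-cycle vector; the variable names and the placeholder capacity data are assumptions for illustration only, not the NASA measurements.

```matlab
% Capacity measured at each cycle (Ah); placeholder data for illustration.
capacityAh = linspace(2.0, 1.2, 170);

% End of life: first cycle at which capacity falls to 70% of the initial value.
eolCycle = find(capacityAh <= 0.7 * capacityAh(1), 1);

% Remaining useful life (in cycles) seen from an arbitrary current cycle.
currentCycle = 100;                       % assumed example
rulCycles    = eolCycle - currentCycle;
fprintf('EOL at cycle %d, RUL from cycle %d: %d cycles\n', ...
    eolCycle, currentCycle, rulCycles);
```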
LSTM Layer Architecture
Key components include memory cells, activation functions (sigmoid, tanh), and gates (input, forget, and output gates) that manage information storage. These gates are crucial for processing data over time, enabling deeper LSTMs to learn long-term dependencies effectively. This enhances their capability in complex tasks like time series analysis and pattern recognition. The architecture of a Deep LSTM includes the blocks shown in Fig. 1.
The network accepts sequences with 12 features per time step and is composed of the following layers:
- A sequence input layer that accepts sequences with 12 features.
- A first LSTM layer with 125 hidden units. It outputs a sequence, meaning it produces a value for each time step of the input, which is useful when feeding into the next LSTM layer for further processing.
- A dropout layer with a rate of 50%, which helps prevent overfitting by randomly dropping out some neurons during training.
- A second LSTM layer with 200 hidden units. This layer outputs only the final hidden state, which is useful when predicting a single value per sequence.
- Another dropout layer, similar to the one after the first LSTM layer, to further reduce overfitting.
- A fully connected layer that outputs predictions for each of the nine classes (output size numClasses = 9).
- A softmax layer that converts the raw outputs of the fully connected layer into probabilities. Each output lies in the range [0, 1], and the outputs sum to 1, which is typical for classification tasks.
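A minimal MATLAB (Deep Learning Toolbox) sketch of this layer stack is given below. The array form and variable names are illustrative assumptions rather than the authors' exact script.

```matlab
numFeatures = 12;    % features per time step
numHidden1  = 125;   % first LSTM layer
numHidden2  = 200;   % second LSTM layer
numClasses  = 9;     % output classes

layers = [
    sequenceInputLayer(numFeatures)
    lstmLayer(numHidden1, 'OutputMode', 'sequence')  % outputs the full sequence
    dropoutLayer(0.5)                                % 50% dropout
    lstmLayer(numHidden2, 'OutputMode', 'last')      % outputs only the final hidden state
    dropoutLayer(0.5)
    fullyConnectedLayer(numClasses)
    softmaxLayer                                     % raw scores to probabilities
    classificationLayer];                            % cross-entropy loss for classification
```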
These formulas describe the components at time step t, with input $x_t$, previous hidden state $h_{t-1}$, input weights $W$, recurrent weights $R$, biases $b$, sigmoid function $\sigma$, and element-wise product $\odot$:

Input gate: $i_t = \sigma(W_i x_t + R_i h_{t-1} + b_i)$

Forget gate: $f_t = \sigma(W_f x_t + R_f h_{t-1} + b_f)$

Cell candidate: $g_t = \tanh(W_g x_t + R_g h_{t-1} + b_g)$

Output gate: $o_t = \sigma(W_o x_t + R_o h_{t-1} + b_o)$

The cell state and hidden state are then updated as $c_t = f_t \odot c_{t-1} + i_t \odot g_t$ and $h_t = o_t \odot \tanh(c_t)$.
Training and Test Data
Generate Samples
This paper used a reduced dataset of measurements from 40 cells for easier downloading and execution of the example. The dataset contains measurements from 40 lithium-ion batteries (3.3 V, 1.1 Ah) tested across different charge and discharge protocols. Each battery's data is stored in a structure containing the information collected within each cycle: current, voltage, temperature, and differential discharge capacity. Each battery is cycled until it reaches 80% of its initial capacity, and the resulting cycle life varies widely, from 150 to 2300 cycles, as illustrated by the histogram in Fig. 2.
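As a small illustration of how the cycle-life spread in Fig. 2 could be plotted, the sketch below builds a histogram from a struct array of battery records; the names batteryData and cycleLife are assumptions for illustration, not the dataset's actual identifiers.

```matlab
% batteryData is assumed to be a 1x40 struct array, one element per cell,
% with a scalar field cycleLife holding the number of cycles to 80% capacity.
cycleLife = [batteryData.cycleLife];   % gather the cycle life of all 40 cells

histogram(cycleLife, 10);              % roughly 150-2300 cycles in the reduced set
xlabel('Cycle life (cycles)');
ylabel('Number of cells');
title('Distribution of cycle life across the 40 cells');
```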
Training Parameters
Training parameters include 88 to 100 epochs, a constant learning-rate schedule, a learning rate of 0.001, a validation frequency of 1190, and a validation patience of 10.
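A hedged MATLAB sketch of these settings, combined with the Adam optimiser and mini-batch size of 256 reported in the evaluation section, might look as follows; XValidation and YValidation are placeholder names for the validation data.

```matlab
% Training options consistent with the values reported in the paper:
% Adam optimiser, mini-batch size 256, constant learning rate of 0.001,
% up to 100 epochs, validation every 1190 iterations, patience of 10.
options = trainingOptions('adam', ...
    'MaxEpochs',           100, ...
    'MiniBatchSize',       256, ...
    'InitialLearnRate',    1e-3, ...
    'LearnRateSchedule',   'none', ...                   % constant learning rate
    'ValidationData',      {XValidation, YValidation}, ...
    'ValidationFrequency', 1190, ...
    'ValidationPatience',  10, ...
    'Shuffle',             'every-epoch', ...
    'Plots',               'training-progress');
```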
Fig. 3 visualises the data characteristics by plotting the current, voltage, and temperature measurements for one complete cycle of the first battery in the dataset.
Fig. 3 shows a positive current during charging and a negative current during discharging. A battery is fully charged at 3.6 V and discharged at 2 V. The data includes tests of various fast-charging policies to assess battery degradation over time and under different loads.
Extract the discharge measurements for the first cycle of the first battery, as shown in Fig. 4. Locate the entries corresponding to the first cycle of the first battery to extract the discharge measurements from the dataset. This typically involves filtering the data to include only the relevant timestamps and voltage readings specific to that cycle. Once the data is extracted, visualise it using plotting tools such as those in MATLAB, as sketched below.
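A minimal MATLAB sketch of this extraction and plot is shown below; the table name firstBattery and its column names (Cycle, Type, Time, Voltage) are assumptions about how the per-battery data might be organised, not the dataset's actual schema.

```matlab
% firstBattery is assumed to be a table of all measurements for battery 1,
% with columns Cycle, Type ("charge"/"discharge"), Time, and Voltage.
isFirstDischarge = firstBattery.Cycle == 1 & firstBattery.Type == "discharge";
dis = firstBattery(isFirstDischarge, :);

% Plot the discharge voltage profile of the first cycle.
plot(dis.Time, dis.Voltage);
xlabel('Time (s)');
ylabel('Voltage (V)');
title('Battery 1, cycle 1: discharge voltage');
```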
Evaluation of the Trained Model
This model is trained with the adaptive moment estimation (Adam) optimiser, widely recognised for its effectiveness in training deep learning algorithms. It operates with a mini-batch size of 256, allowing a balanced approach to data processing and gradient updates. A learning rate of 0.001 is used to strike an optimal balance between achieving convergence and avoiding the risk of overshooting during training. Fig. 5 provides a detailed illustration of the outcomes of this methodology.
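Tying the earlier sketches together, training and inference might look as follows in MATLAB; XTrain, YTrain, and XTest are placeholder names for the prepared sequence data and labels.

```matlab
% Train the DLSTM classifier using the layer stack and training options sketched above.
net = trainNetwork(XTrain, YTrain, layers, options);

% Predict on the held-out test sequences.
scores = predict(net, XTest);    % per-class probabilities from the softmax layer
YPred  = classify(net, XTest);   % hard class labels
```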
Predicted and Actual Cycle Life Result
Fig. 6 presents MATLAB results highlighting the RMSE and the average percentage error. Fig. 7 shows the predicted versus actual cycle life, where the RMSE quantifies deviations and thus indicates model accuracy. The average percentage error helps assess model performance relative to the actual data. The comparison in Fig. 7 illustrates model reliability, with closer agreement between predicted and actual values reflecting the method's effectiveness. These insights aid in enhancing predictive analytics for more robust forecasting.
The MATLAB results show that the root mean squared error (RMSE) is 71.398, and the average percentage error of the predicted remaining cycle life is 18.85%. The model therefore predicts the remaining useful life well across the five batteries when the remaining cycle life is small. This result implies that the model is good at predicting the remaining cycle life as a battery gets closer to the end of its life. During the early part of a battery's life, when the actual remaining cycle life is larger, the model shows greater uncertainty and tends to overestimate the remaining cycle life.
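For reference, the two reported metrics can be computed from vectors of predicted and actual cycle life as in the sketch below; yPred and yActual are placeholder names for the test-set predictions and ground truth.

```matlab
% yPred and yActual are vectors of predicted and true remaining cycle life
% for the test batteries (placeholder names).
rmse       = sqrt(mean((yPred - yActual).^2));               % root mean squared error
meanPctErr = mean(abs(yPred - yActual) ./ yActual) * 100;    % average percentage error

fprintf('RMSE: %.3f, mean percentage error: %.2f%%\n', rmse, meanPctErr);
```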
A combination of model refinement and real-time data analytics could improve prediction accuracy. Integrating machine learning techniques could help better capture complex relationships within the data, leading to a more robust predictive framework. Continuous validation of the model against real-world data will also be essential to adapt its parameters and improve its responsiveness to varying operational conditions.
Conclusions
The paper used a deeper long short-term memory (DLSTM) model to predict the battery's cycle life based on information from the first one hundred cycles. While the current model shows promise in predicting the remaining cycle life, particularly as batteries near their end of life, significant enhancement opportunities remain. By focusing on the initial phase of battery life, we can further develop methodologies that increase the reliability of these predictions, ultimately contributing to more efficient battery utilisation and management in practical applications. Moreover, integrating advanced data analytics and machine learning techniques could provide deeper insights into the factors influencing battery degradation. By harnessing real-time data on battery usage patterns and environmental conditions, we can refine our predictive models to cover a broader set of variables. This holistic approach enhances the accuracy of cycle life predictions and facilitates proactive maintenance strategies.
References
1. Vidal C, Kollmeyer P, Chemali E, Emadi A. Li-ion battery state of charge estimation using long short-term memory recurrent neural network with transfer learning. Proceedings of the IEEE Transportation Electrification Conference and Expo, pp. 1–6, Detroit, MI, USA, 2019. doi: 10.1109/ITEC.2019.8790543.
2. She C, Zhang L, Wang Z, Sun F, Liu P, Song C. Battery state of health estimation based on incremental capacity analysis method: synthesizing from cell-level test to real-world application. IEEE J Emerg Sel Top Power Electron. 2022;10(1):28–41. doi: 10.1109/JESTPE.2021.3112754.
3. Hu X, Li S, Peng H. A comparative study of equivalent circuit models for Li-ion batteries. J Power Sources. 2012;198:359–67. doi: 10.1016/j.jpowsour.2011.10.013.
4. Samanta A, Chowdhuri S, Williamson SS. Machine learning-based data-driven fault detection/diagnosis of lithium-ion battery: a critical review. Electronics. 2021;10(11):1309. doi: 10.3390/electronics10111309.
5. Su X, Wang S, Pecht M, Zhao L, Ye Z. Interacting multiple model particle filter for prognostics of lithium-ion batteries. Microelectron Reliab. 2017;70:59–69. doi: 10.1016/j.microrel.2017.02.003.
6. Sahinoglu GO, Pajovic M, Sahinoglu Z, Wang YB, Orlik PV, Wada T. Battery state-of-charge estimation based on regular/recurrent Gaussian process regression. IEEE Trans Ind Electron. 2018;65:4311–21. doi: 10.1109/TIE.2017.2764869.
7. Ng S, Xing Y, Tsui K. A naive Bayes model for robust remaining useful life prediction of lithium-ion battery. Appl Energy. 2014;118:114–23. doi: 10.1016/j.apenergy.2013.12.020.
8. Wang D, Miao Q, Pecht M. Prognostics of lithium-ion batteries based on relevance vectors and a conditional three-parameter capacity degradation model. J Power Sources. 2013;239:253–64. doi: 10.1016/j.jpowsour.2013.03.129.
9. Liu D, Zhou J, Liao H, Peng Y, Peng X. A health indicator extraction and optimization framework for lithium-ion battery degradation modeling and prognostics. IEEE Trans Syst Man Cybern Syst. 2015;45:915–28. doi: 10.1109/TSMC.2015.2389757.
10. Zhou Y, Huang M, Chen Y, Tao Y. A novel health indicator for online lithium-ion batteries remaining useful life prediction. J Power Sources. 2016;321:1–10. doi: 10.1016/j.jpowsour.2016.04.119.
11. Widodo A, Shim M-C, Caesarendra W, Yang B-S. Intelligent prognostics for battery health monitoring based on sample entropy. Expert Syst Appl. 2011;38:11763–9. doi: 10.1016/j.eswa.2011.03.063.
12. Sandhya P, Ramrao N. Modeling of hybrid active power filter using artificial intelligence controller: hardware and software prospective. Int J Power Electr Drive Syst (IJPEDS). 2021 Dec;12(4):2545–56. ISSN: 2088-8694. doi: 10.11591/ijpeds.v12.i4.pp2545-2556.
13. Liu D, Wang H, Peng Y, Xie W, Liao H. Satellite lithium-ion battery remaining cycle life prediction with novel indirect health indicator extraction. Energies. 2013;6:3654–68. doi: 10.3390/en6083654.
14. qis Al B, Harous S, Zaki N, Alnajjar F. Artificial intelligence in education and assessment methods. Bulletin Electr Eng Inform. 2020 Oct;9(5):1998–2007. ISSN: 2302-9285. doi: 10.11591/eei.v9i5.1984.
15. Hannan MA, Lipu MSH, Hussain A. Toward enhanced state of charge estimation of lithium-ion batteries using optimized machine learning techniques. Sci Rep. 2020;10:4687. doi: 10.1038/s41598-020-61464-7.
16. Kollmeyer P, Vidal C, Naguib M, Skells M. LG 18650HG2 Li-Ion Battery Data and Example Deep Neural Network XEV SOC Estimator Script. Mendeley Data; 5 Mar 2020. doi: 10.17632/CP3473X7XV.3.