This study addresses the crucial challenge of monitoring the State of Health (SOH) of Lithium-Ion Batteries (LIBs) in response to the escalating demand for renewable energy systems and the imperative to reduce CO2 emissions. The research introduces two deep learning (DL) models, Encoder-Long Short-Term Memory (E-LSTM) and Convolutional Neural Network-LSTM (CNN-LSTM), each designed to forecast battery SOH. E-LSTM combines an encoder for dimensionality reduction with an LSTM model to capture temporal dependencies in the data. CNN-LSTM, on the other hand, employs CNN layers for encoding followed by LSTM layers for precise SOH estimation. Significantly, we prioritize model explainability by employing a game-theoretic approach, SHapley Additive exPlanations (SHAP), to elucidate the output of our models. Furthermore, a pattern-mining-based method was developed, working in synergy with the model, to identify patterns contributing to abnormal SOH decrease. These insights are presented through informative plots. The proposed approach relies on the battery dataset from the Massachusetts Institute of Technology (MIT) and shows promising results in accurately predicting SOH values, with the E-LSTM model outperforming the CNN-LSTM model and achieving a Mean Absolute Error (MAE) below 1%.
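For context, SOH is conventionally defined as the ratio of a cell's current discharge capacity to its nominal capacity, and MAE is the headline error metric above. A minimal illustration (the capacity values are made-up, not from the MIT dataset):

```python
def soh(capacity_now, capacity_nominal):
    """State of Health as a percentage of nominal capacity."""
    return 100.0 * capacity_now / capacity_nominal

def mae(y_true, y_pred):
    """Mean Absolute Error between true and predicted SOH values."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

# Made-up capacities (Ah) for a cell with 1.1 Ah nominal capacity
true_soh = [soh(c, 1.1) for c in (1.08, 1.05, 1.01)]
pred_soh = [98.5, 95.0, 92.3]
print(mae(true_soh, pred_soh))
```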
This repo contains code for the paper: Data-Driven Strategy for State of Health Prediction and Anomaly Detection in Lithium-Ion Batteries
@article{slimane2024,
  title={Data-Driven Strategy for State of Health Prediction and Anomaly Detection in Lithium-Ion Batteries},
  author={Arbaoui, Slimane and Samet, Ahmed and Ayadi, Ali and Mesbahi, Tedjani and Boné, Romuald},
  journal={Energy and AI},
  year={2024}
}
- python>=3.10.10
- tensorflow==2.11.1
- keras==2.11.0
- h5py==3.7.0
Download the following three files from the MIT dataset link:
- '2017-05-12_batchdata_updated_struct_errorcorrect.mat'
- '2017-06-30_batchdata_updated_struct_errorcorrect.mat'
- '2018-04-12_batchdata_updated_struct_errorcorrect.mat'
Execute the 'loading_data_MIT.ipynb' notebook, ensuring that the 'path_to_file' variable is set to the directory containing the downloaded files.
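The '.mat' batch files are saved in MATLAB v7.3 format, which is HDF5 under the hood, so h5py (listed in the requirements) can open them directly. A minimal sketch, using a toy file written locally for illustration — the dataset name 'summary_QD' is illustrative, not the real files' internal layout:

```python
import h5py
import numpy as np

# Write a toy HDF5 file standing in for one of the .mat batch files;
# the real files expose a 'batch' group with per-cell data.
with h5py.File("toy_batch.h5", "w") as f:
    grp = f.create_group("batch")
    grp.create_dataset("summary_QD", data=np.array([1.08, 1.05, 1.01]))

# Reading it back works the same way as opening the real batch files
with h5py.File("toy_batch.h5", "r") as f:
    qd = f["batch/summary_QD"][()]
print(qd)
```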
Execute the 'data_preparing_MIT.ipynb' notebook to create an encoder for your data. This step is crucial for data preprocessing and dimensionality reduction.
Execute the 'model_generating_MIT.ipynb' notebook to train the E-LSTM and CNN-LSTM models. After completing this step, run the 'combined_model_MIT.ipynb' notebook to integrate the LSTM model with the encoder.
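Both models are trained on sliding input/output windows over the SOH series; settings such as (10,10), (25,25), and (25,50) in the tables below denote (input steps, output steps). A hypothetical windowing helper sketching this preprocessing step:

```python
import numpy as np

def make_windows(series, n_in, n_out):
    """Slice a 1-D SOH series into (input window, output window) pairs."""
    X, y = [], []
    for i in range(len(series) - n_in - n_out + 1):
        X.append(series[i:i + n_in])
        y.append(series[i + n_in:i + n_in + n_out])
    return np.array(X), np.array(y)

soh_series = np.linspace(100.0, 80.0, 40)  # synthetic SOH fade curve
X, y = make_windows(soh_series, n_in=10, n_out=10)
print(X.shape, y.shape)
```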
To gain insights into model predictions and enhance interpretability, execute the 'SHap_explaining.ipynb' notebook to calculate SHapley Additive exPlanations (SHAP) values.
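SHAP attributes a prediction to features via Shapley values from cooperative game theory: each feature receives its average marginal contribution over all feature orderings. The notebook uses the shap library on the trained model; for intuition only, here is an exact Shapley computation for a made-up three-feature model in pure Python:

```python
from itertools import permutations
from math import factorial

def shapley_values(predict, x, baseline):
    """Exact Shapley values: average marginal contribution of each
    feature over all orderings, with absent features held at baseline."""
    n = len(x)
    phi = [0.0] * n
    for order in permutations(range(n)):
        z = list(baseline)
        prev = predict(z)
        for i in order:
            z[i] = x[i]           # "reveal" feature i
            cur = predict(z)
            phi[i] += cur - prev  # marginal contribution in this ordering
            prev = cur
    return [p / factorial(n) for p in phi]

# Toy "model": weighted sum of three features (stand-ins for battery signals)
predict = lambda z: 2.0 * z[0] - 1.0 * z[1] + 0.5 * z[2]
phi = shapley_values(predict, x=[1.0, 2.0, 3.0], baseline=[0.0, 0.0, 0.0])
print(phi)
```

By the efficiency property, the values sum to `predict(x) - predict(baseline)`; for a linear model each value reduces to the weight times the feature's deviation from baseline.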
In this phase, we use the E-LSTM model to detect abnormal decreases in SOH. Follow the instructions in the 'Abnormal_SOH_Detection.ipynb' notebook. This notebook will leverage pattern mining techniques to identify and visualize patterns contributing to abnormal SOH deterioration.
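The notebook combines the E-LSTM predictions with pattern mining; as a simple baseline for what "abnormal decrease" means, one can flag cycles whose per-cycle SOH drop exceeds a statistical threshold. This z-score rule and its threshold are illustrative, not the notebook's actual method:

```python
import numpy as np

def abnormal_drops(soh, k=3.0):
    """Flag cycle indices whose per-cycle SOH drop exceeds
    mean + k * std of all drops (simple z-score rule)."""
    drops = -np.diff(soh)  # positive value = SOH decreased
    threshold = drops.mean() + k * drops.std()
    return np.flatnonzero(drops > threshold) + 1  # index of the anomalous cycle

soh = np.linspace(100.0, 90.0, 50)
soh[30:] -= 2.0  # inject a sudden extra 2% drop at cycle 30
print(abnormal_drops(soh))
```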
Table 1: E-LSTM performance results, with MAE, MSE, RMSE, and MAPE.
| Metric | (10,10) Mean | (10,10) Std | (25,25) Mean | (25,25) Std | (25,50) Mean | (25,50) Std |
|---|---|---|---|---|---|---|
| MAE (10^{-2}) | 0.86 | 0.06 | 0.83 | 0.07 | 0.89 | 0.05 |
| MSE (10^{-3}) | 0.17 | 0.02 | 0.16 | 0.02 | 0.19 | 0.02 |
| RMSE (10^{-2}) | 1.30 | 0.44 | 1.26 | 0.44 | 1.37 | 0.44 |
| MAPE | 0.91 | 0.06 | 0.89 | 0.07 | 0.95 | 0.05 |
Table 2: CNN-LSTM performance results, with MAE, MSE, RMSE, and MAPE.

| Metric | (10,10) Mean | (10,10) Std | (25,25) Mean | (25,25) Std | (25,50) Mean | (25,50) Std |
|---|---|---|---|---|---|---|
| MAE (10^{-2}) | 0.90 | 0.08 | 1.11 | 0.23 | 1.19 | 0.18 |
| MSE (10^{-3}) | 0.22 | 0.03 | 0.35 | 0.12 | 0.35 | 0.09 |
| RMSE (10^{-2}) | 1.40 | 0.54 | 1.87 | 1.07 | 1.87 | 0.94 |
| MAPE | 0.96 | 0.09 | 1.25 | 0.24 | 1.27 | 0.20 |
Figure 1: SOH prediction using a 10-step input window for a 10-step output window.
Figure 2: The average contribution of each feature in the model’s prediction using E-LSTM.
If you have any issues or questions about this repo, feel free to contact slimane.arbaoui@insa-strasbourg.fr.