Abstract
Remaining useful life (RUL) prediction for lithium-ion batteries assesses battery reliability, anticipates the advent of failure, and mitigates operational risk. Existing RUL prediction techniques for lithium-ion batteries are inefficient at learning the long-term dependencies among degraded capacities. This paper investigates deep-learning-enabled battery RUL prediction. A long short-term memory (LSTM) recurrent neural network (RNN) is employed to learn the long-term dependencies among the degraded capacities of lithium-ion batteries. The LSTM RNN is adaptively optimized with the resilient mean square back-propagation method, and dropout is applied to address overfitting. The resulting network captures the underlying long-term dependencies among the degraded capacities and constructs an explicitly capacity-oriented RUL predictor, whose long-term learning performance is compared with that of the support vector machine model, the particle filter model, and a simple RNN model. Monte Carlo simulation is further employed to generate a probabilistic RUL prediction. Experimental data from multiple lithium-ion cells at two different temperatures are used for model construction, verification, and comparison. The developed method can predict a battery's RUL without offline training data, and when some offline data are available, it predicts the RUL earlier than traditional methods.
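The abstract combines four ingredients: an LSTM that learns long-term dependencies in the capacity series, dropout against overfitting, resilient mean square back-propagation (RMSprop) for optimization, and Monte Carlo simulation for a probabilistic RUL. Below is a minimal PyTorch sketch of that combination; it is not the authors' implementation, and the window size, layer sizes, failure threshold, function names, and the assumption that the Monte Carlo step is realized by keeping dropout active during the recursive forecast are all illustrative choices.

```python
# Minimal sketch (illustrative, not the paper's code): an LSTM maps a
# window of past capacities to the next-cycle capacity, is trained with
# RMSprop, regularized with dropout, and sampled with dropout left
# active (Monte Carlo) to obtain an RUL distribution.
import torch
import torch.nn as nn

class CapacityLSTM(nn.Module):
    def __init__(self, hidden_size=32, dropout=0.2):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden_size,
                            batch_first=True)
        self.drop = nn.Dropout(dropout)   # kept stochastic during MC sampling
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x):                 # x: (batch, window, 1)
        out, _ = self.lstm(x)
        return self.head(self.drop(out[:, -1, :]))  # next-cycle capacity

def fit(model, x, y, epochs=200, lr=1e-3):
    """Train on (window, next-capacity) pairs with RMSprop + MSE loss."""
    opt = torch.optim.RMSprop(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    model.train()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()

@torch.no_grad()
def mc_rul(model, window, threshold, n_samples=100, max_cycles=500):
    """Roll the model forward until the predicted capacity crosses
    `threshold`, repeating with dropout enabled to sample an RUL
    distribution (the assumed Monte Carlo step)."""
    model.train()                          # keep dropout active
    ruls = []
    for _ in range(n_samples):
        w = window.clone()                 # shape (1, window, 1)
        for cycle in range(1, max_cycles + 1):
            nxt = model(w)                 # predicted next capacity
            if nxt.item() <= threshold:
                break
            w = torch.cat([w[:, 1:, :], nxt.view(1, 1, 1)], dim=1)
        ruls.append(cycle)
    return torch.tensor(ruls, dtype=torch.float)
```

Calling `mc_rul` with the most recent measured capacity window returns a sample of end-of-life cycle counts; its mean gives a point RUL estimate and its spread a confidence interval, which is the kind of probabilistic prediction the abstract describes.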
Publication Info
- Year: 2018
- Type: article
- Volume: 67
- Issue: 7
- Pages: 5695-5705
- Citations: 1150
- Access: Closed
Identifiers
- DOI: 10.1109/tvt.2018.2805189