CESI
Research Internship (Stage recherche)
Job Location
Nanterre, France
Job Description
In the evolving landscape of AI, Attention LSTMs are becoming critical tools for enhancing the explainability of deep learning models. While deep learning approaches, such as LSTMs (Long Short-Term Memory networks), achieve high prognostic performance, they often come with significant computational demands and limited interpretability. Standard feature extraction models sometimes exacerbate this issue by losing information during training, leading to poorer predictions, especially in Remaining Useful Life (RUL) forecasting tasks.
Attention mechanisms integrated into LSTM models provide a solution by allowing the network to focus on the most relevant parts of the input sequence. Attention helps identify key time steps or data points that are most influential in predictions, which is particularly valuable in applications like time-series analysis and predictive maintenance.
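As a rough illustration, the sketch below shows one common way of wiring a simple attention layer onto an LSTM for RUL regression in Keras. The window length, feature count, and layer sizes are placeholder values, and the scoring layer is a simplified learned per-time-step score rather than any specific published variant.

```python
# Minimal sketch (assumed shapes and sizes, not the internship's exact model):
# an LSTM over sensor windows, with a learned per-time-step score that is
# softmax-normalised and used to pool the sequence before the RUL head.
import tensorflow as tf
from tensorflow.keras import layers, Model

def build_attention_lstm(window_len=30, n_features=14, lstm_units=64):
    inputs = layers.Input(shape=(window_len, n_features))
    # Return the full sequence so attention can score every time step
    h = layers.LSTM(lstm_units, return_sequences=True)(inputs)
    scores = layers.Dense(1, activation="tanh")(h)       # (batch, T, 1) raw scores
    weights = layers.Softmax(axis=1)(scores)              # attention weights over time
    # Weighted sum of hidden states -> one context vector per window
    context = layers.Lambda(lambda x: tf.reduce_sum(x[0] * x[1], axis=1))([h, weights])
    rul = layers.Dense(1)(context)                         # RUL estimate (regression)
    return Model(inputs, rul)

model = build_attention_lstm()
model.compile(optimizer="adam", loss="mse")
```

Because the weights sum to one across the window, they can be read back at inference time to see which cycles dominated a given prediction, which is where the interpretability gain comes from.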
Incorporating hyperparameter optimization further enhances model performance. In our previous work, we introduced BootBOGS [1], a hybrid hyperparameter optimization technique that combines bootstrap resampling, Bayesian Optimization, and Grid Search. This method was designed to efficiently search hyperparameter spaces, reducing variance in model performance and refining the hyperparameter ranges for better accuracy.
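The snippet below is only an illustrative reading of that general recipe, not the published BootBOGS procedure: bootstrap resampling gives a lower-variance score for each configuration, Bayesian Optimization (via scikit-optimize, assumed available) explores a wide range first, and a small grid search then refines around the region it found. The `train_fn` callback, which is assumed to train a model on a resample and return a validation error, and the two-parameter space are hypothetical placeholders.

```python
# Illustrative sketch only; the actual BootBOGS algorithm in [1] may differ.
import numpy as np
from sklearn.utils import resample
from skopt import gp_minimize            # assumes scikit-optimize is installed
from skopt.space import Integer, Real

def bootstrap_score(train_fn, X, y, params, n_boot=5):
    """Average validation error of one configuration over bootstrap resamples."""
    scores = [train_fn(*resample(X, y, random_state=b), params) for b in range(n_boot)]
    return float(np.mean(scores))         # lower-variance estimate of the config

def tune(train_fn, X, y):
    # Stage 1: Bayesian Optimization over a wide (hypothetical) space,
    # each candidate scored on bootstrap resamples of the training data.
    space = [Integer(32, 256, name="lstm_units"),
             Real(1e-4, 1e-2, prior="log-uniform", name="learning_rate")]
    result = gp_minimize(lambda p: bootstrap_score(train_fn, X, y, p),
                         space, n_calls=20, random_state=0)
    units, lr = result.x
    # Stage 2: Grid Search refined around the region BO found promising.
    grid = [(u, l) for u in (max(32, units - 16), units, units + 16)
                   for l in (lr / 2, lr, lr * 2)]
    return min(grid, key=lambda p: bootstrap_score(train_fn, X, y, p))
```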
The goal of this internship is to integrate BootBOGS into an Attention-enhanced LSTM for RUL prediction. You will tune various parameters, including the number of LSTM units, dropout rates to mitigate overfitting, attention head sizes (in multi-head attention mechanisms), and learning rates. By optimizing these parameters, the performance and interpretability of the model will be significantly improved, allowing it to focus on the most critical features of the input sequence. This enhances the model's ability to explain predictions, an important factor in high-stakes applications like predictive maintenance.
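For concreteness, a search space covering the four parameter families named above might look like the following; the ranges are placeholders to be fixed during the internship, not values prescribed by CESI.

```python
# Hypothetical ranges; the internship would determine the real ones.
search_space = {
    "lstm_units":      [32, 64, 128, 256],        # width of the recurrent layer
    "dropout":         [0.1, 0.2, 0.3, 0.5],      # regularisation against overfitting
    "attention_heads": [1, 2, 4, 8],              # heads in multi-head attention
    "learning_rate":   [1e-4, 3e-4, 1e-3, 3e-3],  # optimiser step size
}
```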
KEY OBJECTIVES
1. Incorporate BootBOGS into the LSTM framework for enhanced hyperparameter tuning.
2. Perform computational experiments using the NASA C-MAPSS datasets to evaluate model effectiveness (a data-preparation sketch follows this list).
3. Focus on improving both the performance and explainability of RUL prediction.
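For reference, a hedged sketch of the usual C-MAPSS preparation step: the column layout (engine id, cycle, three operational settings, 21 sensors) and the max-cycle RUL labelling rule are conventions commonly used in the literature, not a specification from this posting.

```python
import pandas as pd

# 26 whitespace-separated columns per the standard C-MAPSS file layout
cols = (["unit", "cycle"]
        + [f"op_setting_{i}" for i in range(1, 4)]
        + [f"sensor_{i}" for i in range(1, 22)])

train = pd.read_csv("train_FD001.txt", sep=r"\s+", header=None, names=cols)

# Common labelling rule: RUL = last observed cycle of the engine minus current cycle
max_cycle = train.groupby("unit")["cycle"].transform("max")
train["RUL"] = max_cycle - train["cycle"]
```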
Location: Nanterre, FR
Posted Date: 11/15/2024
Contact Information
Contact: Human Resources CESI