
Using time series foundation models for few-shot remaining useful life prediction of aircraft engines

Abstract: Predictive maintenance often involves imbalanced multivariate time-series datasets with scarce failure events; the high dimensionality of the data and the need for domain-specific preprocessing make model training challenging and frequently lead to large, complex models. Inspired by the success of Large Language Models (LLMs), transformer-based foundation models have been developed for time series (TSFMs). These models have been shown to reconstruct time series in a zero-shot manner, capturing the patterns that effectively characterize them. This paper proposes using a TSFM to generate embeddings of the input data space, making it more interpretable for machine learning models. To evaluate the effectiveness of our approach, we trained three classical machine learning algorithms and one neural network on embeddings generated by the Moment TSFM to predict the remaining useful life of aircraft engines. We test models trained with both the full training dataset and with only 10% of the training samples. Our results show that training simple models, such as support vector regressors or neural networks, on embeddings generated by Moment not only accelerates training but also improves performance in few-shot learning scenarios, where data is scarce. This suggests a promising alternative to complex deep learning architectures, particularly in industrial contexts with limited labeled data.
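The pipeline the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' implementation: the embedding step is simulated with random vectors standing in for the fixed-length representations a frozen TSFM such as Moment would produce for each multivariate sensor window, and scikit-learn's SVR plays the role of the simple downstream regressor; the dataset sizes and RUL range are assumptions.

```python
import numpy as np
from sklearn.svm import SVR

# Simulated TSFM embeddings: in the paper, each multivariate engine
# sensor window is encoded by the frozen Moment model into a
# fixed-length vector; here random data stands in for those vectors.
rng = np.random.default_rng(0)
n_windows, embed_dim = 200, 64          # assumed sizes, for illustration
X = rng.normal(size=(n_windows, embed_dim))   # stand-in embeddings
y = rng.uniform(0, 125, size=n_windows)       # stand-in RUL labels

# Full-data regime: fit a simple regressor on all labelled embeddings.
svr_full = SVR(kernel="rbf").fit(X, y)

# Few-shot regime: only 10% of the labelled samples, as in the paper.
idx = rng.choice(n_windows, size=n_windows // 10, replace=False)
svr_few = SVR(kernel="rbf").fit(X[idx], y[idx])

# Both models map an embedding to one RUL estimate per window.
preds = svr_few.predict(X)
print(preds.shape)
```

Because the foundation model is frozen and only the lightweight regressor is trained, the few-shot fit above touches just 20 labelled samples, which is the scenario where the paper reports the largest benefit.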

 Authorship: Dintén R., Zorrilla M.

 Source: Computer Modeling in Engineering & Sciences, 2025, 144(1), 239-265

 Publisher: Tech Science Press

 Publication date: 01/07/2025

 No. of pages: 27

 Publication type: Article

 DOI: 10.32604/cmes.2025.065461

 ISSN: 1526-1492, 1526-1506

 Spanish project: PID2021-124502OB-C42

 Publication Url: https://doi.org/10.32604/cmes.2025.065461