Visual prediction method based on time series-driven LSTM model
| Published in: | Scientific Reports, Vol. 15, Issue 1, art. no. 38057 (14 pp.) |
|---|---|
| Main Authors: | , , , |
| Format: | Journal Article |
| Language: | English |
| Publication details: | London: Nature Publishing Group UK (Nature Portfolio), 30.10.2025 |
| Subjects: | |
| ISSN: | 2045-2322 |
| Online access: | Get full text |
| Summary: | Significant progress has been made on time series prediction and image processing problems. However, most studies have focused on either time series or image processing separately, failing to integrate the advantages of both fields. To overcome the limitations of existing algorithms in image temporal inference, this paper proposes a novel visual prediction framework based on a time series forecasting model, which can predict single-frame or multi-frame images by thoroughly analyzing their spatio-temporal features. First, a ViT image feature extraction module is constructed that randomly masks and reconstructs images to learn image representations and extract features. Then, a time series construction module is designed to convert the extracted features into a form suitable for the LSTM network. Finally, the time series data are predicted with the LSTM, and the predicted time series data are transformed back into predicted images. A series of experiments is conducted on three types of cloud image datasets. The results demonstrate the effectiveness and feasibility of the proposed method in terms of image prediction performance. |
|---|---|
| DOI: | 10.1038/s41598-025-21911-9 |
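The summary above outlines a three-stage pipeline: a ViT-style module extracts per-frame features, the features from consecutive frames are arranged as a time series, and an LSTM forecasts the next feature vector, which is then decoded back into an image. The following is a minimal sketch of that idea in PyTorch; all module sizes, names, and the mean-pooling/linear-decoder choices are illustrative assumptions, not the paper's actual architecture.

```python
import torch
import torch.nn as nn

class FramePredictor(nn.Module):
    """Hypothetical sketch: ViT-style encoder -> LSTM -> image decoder."""

    def __init__(self, img_size=32, patch=8, dim=64, hidden=128):
        super().__init__()
        # Stand-in for the ViT feature extractor: patch embedding
        # followed by a small transformer encoder (sizes are assumptions).
        self.patchify = nn.Conv2d(1, dim, kernel_size=patch, stride=patch)
        enc_layer = nn.TransformerEncoderLayer(d_model=dim, nhead=4,
                                               batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=1)
        # LSTM over the sequence of per-frame feature vectors.
        self.lstm = nn.LSTM(input_size=dim, hidden_size=hidden,
                            batch_first=True)
        # Simple linear decoder from the forecast feature back to pixels;
        # the paper's decoding step is likely more elaborate.
        self.decoder = nn.Linear(hidden, img_size * img_size)
        self.img_size = img_size

    def encode(self, frame):
        # frame: (B, 1, H, W) -> patch tokens (B, N, dim) -> pooled (B, dim)
        tokens = self.patchify(frame).flatten(2).transpose(1, 2)
        return self.encoder(tokens).mean(dim=1)

    def forward(self, frames):
        # frames: (B, T, 1, H, W); encode each frame, then run the LSTM
        # over the resulting feature time series.
        B, T = frames.shape[:2]
        feats = torch.stack([self.encode(frames[:, t]) for t in range(T)],
                            dim=1)                  # (B, T, dim)
        out, _ = self.lstm(feats)                   # (B, T, hidden)
        pred = self.decoder(out[:, -1])             # forecast from last step
        return pred.view(B, 1, self.img_size, self.img_size)

model = FramePredictor()
clip = torch.randn(2, 4, 1, 32, 32)  # two clips of four 32x32 frames
next_frame = model(clip)
print(next_frame.shape)  # one predicted frame per clip
```

Multi-frame prediction, as mentioned in the summary, would follow by feeding each predicted feature vector back into the LSTM autoregressively; the single-step version above keeps the sketch short.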