Visual prediction method based on time series-driven LSTM model

Bibliographic Details
Published in: Scientific Reports, Vol. 15, no. 1, article 38057 (14 pages)
Main Authors: Jumahong, Huxidan; Wang, Yongjie; Aili, Abuduwaili; Wang, Weina
Format: Journal Article
Language: English
Published: London: Nature Publishing Group UK, 30.10.2025
ISSN: 2045-2322
Description
Summary: Significant progress has been made on time series prediction and image processing problems. However, most studies have focused on either time series or image processing separately, failing to integrate the advantages of both fields. To overcome the limitations of existing algorithms in image temporal inference, this paper proposes a novel visual prediction framework based on a time series forecasting model, which can predict single-frame or multi-frame images by thoroughly analyzing their spatio-temporal features. First, a ViT image feature extraction module is constructed by randomly masking and reconstructing images, allowing the model to learn image representations and extract features. Then, a time series construction module converts the extracted features into a time series representation suitable for the LSTM network. Finally, the time series data is predicted by the LSTM, and the predicted time series data is transformed back into a predicted image. A series of experiments is conducted on three types of cloud image datasets, and the results demonstrate the effectiveness and feasibility of the proposed method in terms of image prediction performance.
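
The abstract outlines a three-stage pipeline: per-frame feature extraction with a ViT, conversion of the frame features into a time series, and LSTM prediction of the next feature vector, which is then decoded back into an image. The following minimal PyTorch sketch illustrates that pipeline under stated assumptions; the module sizes, mean pooling, and linear frame decoder are illustrative choices, not the authors' implementation, and the masked-reconstruction pretraining of the ViT module is omitted.

# Illustrative sketch of the ViT -> time series -> LSTM pipeline.
# Sizes, pooling, and the decoder are assumptions for demonstration only.
import torch
import torch.nn as nn

class FramePredictor(nn.Module):
    def __init__(self, img_size=64, patch=8, dim=128, hidden=256):
        super().__init__()
        # Patch-embedding front end standing in for the ViT feature
        # extraction module (masking/reconstruction pretraining omitted).
        self.embed = nn.Conv2d(1, dim, kernel_size=patch, stride=patch)
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True),
            num_layers=2,
        )
        # The LSTM consumes one pooled feature vector per frame.
        self.lstm = nn.LSTM(dim, hidden, batch_first=True)
        # Decoder maps the predicted feature back to a full frame.
        self.decode = nn.Linear(hidden, img_size * img_size)
        self.img_size = img_size

    def forward(self, frames):
        # frames: (batch, time, 1, H, W) grayscale clip
        b, t = frames.shape[:2]
        x = frames.flatten(0, 1)                           # (b*t, 1, H, W)
        tokens = self.embed(x).flatten(2).transpose(1, 2)  # (b*t, patches, dim)
        feats = self.encoder(tokens).mean(dim=1)           # pooled frame feature
        seq = feats.view(b, t, -1)                         # (b, t, dim) time series
        out, _ = self.lstm(seq)
        pred = self.decode(out[:, -1])                     # next-frame prediction
        return pred.view(b, 1, self.img_size, self.img_size)

if __name__ == "__main__":
    model = FramePredictor()
    clip = torch.randn(2, 5, 1, 64, 64)  # 2 clips of 5 grayscale frames
    print(model(clip).shape)             # torch.Size([2, 1, 64, 64])

In this reading, the "time series construction module" simply stacks the per-frame ViT features along the time axis; multi-frame prediction would feed each predicted feature back into the LSTM autoregressively.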
Bibliography:ObjectType-Article-1
SourceType-Scholarly Journals-1
ObjectType-Feature-2
ISSN: 2045-2322
DOI: 10.1038/s41598-025-21911-9