Deep Nonlinear Dynamic Feature Extraction for Quality Prediction Based on Spatiotemporal Neighborhood Preserving SAE

Detailed Bibliography
Published in: IEEE Transactions on Instrumentation and Measurement, Vol. 70, pp. 1-10
Main Authors: Liu, Chenliang; Wang, Kai; Wang, Yalin; Xie, Shengli; Yang, Chunhua
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2021
ISSN: 0018-9456, 1557-9662
Description
Summary: Complex industrial process data often exhibit nonlinear static and dynamic characteristics. Traditional deep learning methods such as the stacked autoencoder (SAE) have excellent nonlinear static feature learning capabilities, but they ignore the dynamic correlations present in process data. Feature learning based on manifold learning with neighborhood structure preservation has been widely used in industrial dynamic process monitoring. However, most manifold learning methods extract linear features, and the complex nonlinearities in process data are ignored. Therefore, a novel spatiotemporal neighborhood preserving stacked autoencoder (STNP-SAE) is proposed in this article to simultaneously learn deep nonlinear static and dynamic features of process data. By constructing spatial and temporal adjacency graphs, STNP-SAE captures the spatiotemporal neighborhood structure of process data during feature learning. STNP-SAE is then used to construct a soft sensor framework for quality prediction. The prediction performance of the proposed method is validated on a practical industrial process.
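The abstract names the key ingredients (an SAE plus spatial and temporal adjacency graphs) but not the equations, so the following is a minimal sketch of how a spatiotemporal neighborhood preserving penalty could be attached to a single autoencoder layer. It is an illustration under assumptions, not the authors' implementation: the k-nearest-neighbor spatial graph, the fixed-window temporal graph, the sigmoid code, the weights lam_s and lam_t, and the single-layer training loop (the paper stacks several such layers and adds a quality-prediction head for the soft sensor) are all choices made here for concreteness.

```python
import torch
import torch.nn as nn

def knn_graph(x, k):
    """Binary spatial adjacency: each sample is linked to its k nearest
    neighbors in the input space (Euclidean distance)."""
    d = torch.cdist(x, x)                       # pairwise distances
    d.fill_diagonal_(float("inf"))              # exclude self-links
    idx = d.topk(k, largest=False).indices      # indices of k nearest neighbors
    w = torch.zeros_like(d)
    w.scatter_(1, idx, 1.0)
    return w

def temporal_graph(n, window):
    """Binary temporal adjacency: sample t is linked to samples within
    +/- `window` time steps, assuming rows are ordered by sampling time."""
    t = torch.arange(n)
    gap = (t[:, None] - t[None, :]).abs()
    return ((gap > 0) & (gap <= window)).float()

class NPAutoencoder(nn.Module):
    """One autoencoder layer whose code will carry a neighborhood-preserving penalty."""
    def __init__(self, d_in, d_hidden):
        super().__init__()
        self.enc = nn.Linear(d_in, d_hidden)
        self.dec = nn.Linear(d_hidden, d_in)

    def forward(self, x):
        h = torch.sigmoid(self.enc(x))
        return h, self.dec(h)

def graph_penalty(h, w):
    """Laplacian-style smoothness: mean of w_ij * ||h_i - h_j||^2 over graph edges."""
    sq = (h.unsqueeze(1) - h.unsqueeze(0)).pow(2).sum(-1)   # pairwise squared distances
    return (w * sq).sum() / w.sum().clamp_min(1.0)

# Illustrative training loop on random data (sizes and weights are hypothetical).
torch.manual_seed(0)
x = torch.randn(200, 16)                # 200 time-ordered samples, 16 process variables
w_s = knn_graph(x, k=5)                 # spatial adjacency graph
w_t = temporal_graph(len(x), window=2)  # temporal adjacency graph

ae = NPAutoencoder(16, 8)
opt = torch.optim.Adam(ae.parameters(), lr=1e-3)
lam_s, lam_t = 0.1, 0.1                 # regularization weights (assumed)

for epoch in range(200):
    h, x_hat = ae(x)
    loss = (nn.functional.mse_loss(x_hat, x)      # reconstruction term of the SAE layer
            + lam_s * graph_penalty(h, w_s)       # preserve spatial neighborhoods
            + lam_t * graph_penalty(h, w_t))      # preserve temporal neighborhoods
    opt.zero_grad()
    loss.backward()
    opt.step()
```

The two graph penalty terms pull the latent codes of spatially and temporally adjacent samples toward each other, which is the sense in which the neighborhood structure of the process data is preserved while the encoder learns nonlinear features.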
DOI: 10.1109/TIM.2021.3122187