GCN-Based LSTM Autoencoder with Self-Attention for Bearing Fault Diagnosis

Published in: Sensors (Basel, Switzerland), Volume 24, Issue 15, p. 4855
Main authors: Lee, Daehee; Choo, Hyunseung; Jeong, Jongpil
Format: Journal Article
Language: English
Publication details: Switzerland: MDPI AG, 1 August 2024
ISSN: 1424-8220
Description
Summary: The manufacturing industry has been operating within a constantly evolving technological environment, underscoring the importance of maintaining the efficiency and reliability of manufacturing processes. Motor-related failures, especially bearing defects, are common and serious issues in manufacturing processes. Bearings provide accurate and smooth movements and play essential roles in mechanical equipment with shafts. Given their importance, bearing failure diagnosis has been extensively studied. However, the imbalance in failure data and the complexity of time series data make diagnosis challenging. Conventional AI models, such as convolutional neural networks (CNNs), long short-term memory (LSTM), support vector machines (SVMs), and extreme gradient boosting (XGBoost), face limitations in diagnosing such failures. To address this problem, this paper proposes a bearing failure diagnosis model using a graph convolution network (GCN)-based LSTM autoencoder with self-attention. The model was trained on data extracted from the Case Western Reserve University (CWRU) dataset and a fault simulator testbed. The proposed model achieved 97.3% accuracy on the CWRU dataset and 99.9% accuracy on the fault simulator dataset.
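
The abstract names a GCN-based LSTM autoencoder with self-attention but gives no implementation details. The following is a minimal PyTorch sketch of one plausible arrangement of those components; the graph construction (a uniform adjacency over time steps), the layer sizes, and the 10-class output head are illustrative assumptions, not the authors' code.

    # Hypothetical sketch (not the authors' implementation):
    # GCN features -> LSTM encoder -> self-attention -> LSTM decoder + fault classifier.
    import torch
    import torch.nn as nn

    class GCNLayer(nn.Module):
        # One graph convolution: relu(A_hat @ X @ W), with A_hat a normalized adjacency.
        def __init__(self, in_dim, out_dim):
            super().__init__()
            self.linear = nn.Linear(in_dim, out_dim)

        def forward(self, x, a_hat):
            # x: (batch, nodes, in_dim); a_hat: (nodes, nodes), broadcast over the batch.
            return torch.relu(self.linear(a_hat @ x))

    class GCNLSTMAutoencoder(nn.Module):
        def __init__(self, in_dim=1, gcn_dim=32, hidden=64, n_heads=4, n_classes=10):
            super().__init__()
            self.gcn = GCNLayer(in_dim, gcn_dim)
            self.encoder = nn.LSTM(gcn_dim, hidden, batch_first=True)
            self.attn = nn.MultiheadAttention(hidden, n_heads, batch_first=True)
            self.decoder = nn.LSTM(hidden, hidden, batch_first=True)
            self.recon_head = nn.Linear(hidden, in_dim)     # reconstruct the input signal
            self.classifier = nn.Linear(hidden, n_classes)  # predict the fault class

        def forward(self, x, a_hat):
            # x: (batch, time_steps, in_dim); each time step is treated as one graph node.
            h = self.gcn(x, a_hat)              # spatial (graph) features
            enc, _ = self.encoder(h)            # temporal encoding
            ctx, _ = self.attn(enc, enc, enc)   # self-attention over the sequence
            dec, _ = self.decoder(ctx)
            recon = self.recon_head(dec)        # autoencoder reconstruction branch
            logits = self.classifier(ctx.mean(dim=1))
            return recon, logits

    # Usage with an assumed 100-step vibration window and a uniform adjacency.
    T = 100
    a_hat = torch.full((T, T), 1.0 / T)
    model = GCNLSTMAutoencoder()
    recon, logits = model(torch.randn(8, T, 1), a_hat)

In such a design, the reconstruction branch is typically trained jointly with the classification head, which can help with the class imbalance the abstract mentions; how the paper actually combines the two objectives is not stated here.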
DOI: 10.3390/s24154855