A Data-Driven Method Based on Bidirectional Convolutional Recurrent Neural Network to Detect Structural Damage

Bibliographic Details
Published in: Iranian Journal of Science and Technology, Transactions of Civil Engineering, Vol. 49, No. 1, pp. 579-595
Main Authors: Xue, Songling, Su, Teng, Xie, Qinghai, Zhao, Xiaoqing, Zong, Zhongling
Format: Journal Article
Language: English
Published: Cham: Springer International Publishing (Springer Nature B.V.), 01.02.2025
ISSN: 2228-6160, 2364-1843
Description
Summary: To address the challenge that structural damage identification often lacks both precise mathematical and mechanical models and labeled data, this paper proposes an approach that eliminates the need for such models and labels. First, raw acceleration responses serve as input data and undergo preprocessing to derive training samples and labels. These samples and labels are then fed into the bidirectional convolutional recurrent neural network constructed in this study, whose parameters are optimized through training. Finally, synthesized acceleration signals are input into the trained network to obtain predicted signals, and a custom-defined damage index is used to quantify structural damage. The applicability of this methodology is validated through numerical simulations and an experimental study. The findings demonstrate that the proposed approach is an unsupervised, data-driven method capable of identifying dynamic structural damage without relying on structural labels or precise computational models.
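
The pipeline summarized above (raw acceleration windows, convolutional feature extraction, bidirectional recurrence, signal prediction, and a residual-style damage index) can be sketched in a few lines of PyTorch. This is an illustrative assumption of the architecture: the layer sizes, the GRU choice, and the damage_index helper are hypothetical and are not taken from the paper, whose exact network and custom-defined index are described in the full text.

```python
# Minimal sketch of a bidirectional convolutional recurrent network for
# reconstructing acceleration signals (assumed architecture, not the
# authors' exact model; layer sizes and the index below are illustrative).
import torch
import torch.nn as nn

class BiConvRNN(nn.Module):  # hypothetical name
    def __init__(self, in_channels=1, conv_channels=32, hidden=64):
        super().__init__()
        # 1-D convolutions extract local features from raw acceleration
        self.conv = nn.Sequential(
            nn.Conv1d(in_channels, conv_channels, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(conv_channels, conv_channels, kernel_size=5, padding=2),
            nn.ReLU(),
        )
        # A bidirectional GRU models temporal dependence in both directions
        self.rnn = nn.GRU(conv_channels, hidden, batch_first=True,
                          bidirectional=True)
        # Linear head maps each time step back to an acceleration value
        self.head = nn.Linear(2 * hidden, in_channels)

    def forward(self, x):                    # x: (batch, channels, time)
        h = self.conv(x)                     # (batch, conv_channels, time)
        h = h.transpose(1, 2)                # (batch, time, conv_channels)
        h, _ = self.rnn(h)                   # (batch, time, 2 * hidden)
        return self.head(h).transpose(1, 2)  # (batch, channels, time)

def damage_index(model, signal):
    """Hypothetical residual-based index: the normalized error between a
    measured signal and the network's prediction of it."""
    model.eval()
    with torch.no_grad():
        pred = model(signal)
    return (torch.norm(signal - pred) / torch.norm(signal)).item()

# Example: train on reconstruction error, then score a new measurement.
model = BiConvRNN()
x = torch.randn(8, 1, 1024)          # 8 windows of synthetic acceleration
loss = nn.MSELoss()(model(x), x)     # unsupervised reconstruction objective
print(damage_index(model, x[:1]))    # larger residual suggests more damage
```

Under the paper's framing, the training labels come from preprocessing the responses themselves, so no ground-truth damage labels are needed; in this sketch that corresponds to the reconstruction objective, which is one plausible reading rather than a confirmed detail of the method.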
DOI: 10.1007/s40996-024-01427-4