Dynamic feature capturing in a fluid flow reduced-order model using attention-augmented autoencoders

Bibliographic Details
Published in: Engineering Applications of Artificial Intelligence, Vol. 149, p. 110463
Main Authors: Beiki, Alireza; Kamali, Reza
Format: Journal Article
Language: English
Published: Elsevier Ltd, 01.06.2025
ISSN:0952-1976
Description
Summary: This study investigates how augmenting convolutional autoencoders with adaptive attention improves flow-field reconstruction in fluid dynamics applications. The effectiveness of the proposed adaptive attention mechanism is compared with the convolutional block attention module (CBAM) on two different datasets. The analysis covers reconstruction loss, latent-space characteristics, and the application of attention mechanisms to time-series forecasting. Combining adaptive attention with involution layers enhances the model's ability to identify and highlight significant features, surpassing CBAM and improving reconstruction accuracy by more than 20%. Latent-space analysis shows that the adaptive attention mechanism produces a richer, more flexible encoding, making it easier for the model to represent different types of data. The study also examines how attention operates and how it affects time-series forecasting, showing that a new method combining multi-head attention with bidirectional long short-term memory performs well when forecasting flow fields more than 5 s into the future. This research provides valuable insights into the role of attention mechanisms in improving model accuracy, generalization, and forecasting capability in fluid dynamics.
Highlights:
•Adaptive attention boosts flow field reconstruction.
•Involution layers enhance latent space adaptability.
•Multi-head attention improves time-series forecasting.
•Attention mechanisms refine latent space representation.
•Dataset-specific impact on clustering with attention.
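The abstract compares the proposed adaptive attention against the convolutional block attention module (CBAM) baseline. For orientation, the following is a minimal NumPy sketch of CBAM-style channel attention only (the baseline named in the abstract, not the paper's adaptive mechanism); the channel count, reduction ratio, and random weights are illustrative assumptions rather than the paper's configuration.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cbam_channel_attention(x, w1, w2):
    """CBAM-style channel attention on a feature map x of shape (C, H, W).

    Average- and max-pooled channel descriptors pass through a shared
    two-layer MLP (w1: C -> C/r, w2: C/r -> C); their sum, squashed by a
    sigmoid, rescales each channel of x.
    """
    avg = x.mean(axis=(1, 2))                       # (C,) average-pooled descriptor
    mx = x.max(axis=(1, 2))                         # (C,) max-pooled descriptor
    mlp = lambda v: w2 @ np.maximum(w1 @ v, 0.0)    # shared MLP with ReLU
    weights = sigmoid(mlp(avg) + mlp(mx))           # (C,) per-channel gates in (0, 1)
    return x * weights[:, None, None]               # rescale each channel

# Illustrative usage with random weights (reduction ratio r = 2).
rng = np.random.default_rng(0)
C, H, W, r = 4, 8, 8, 2
x = rng.standard_normal((C, H, W))
w1 = rng.standard_normal((C // r, C))
w2 = rng.standard_normal((C, C // r))
y = cbam_channel_attention(x, w1, w2)
```

In the full CBAM this channel step is followed by a spatial attention step; the abstract's claim is that the adaptive attention mechanism, paired with involution layers, outperforms this kind of fixed channel/spatial gating.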
DOI:10.1016/j.engappai.2025.110463