FastDTW-Encoded Spatial-temporal Attention Dual Graph Convolutional Network for Traffic Flow Prediction

Detailed bibliography
Published in: IEEE International Symposium on IT in Medicine and Education, pp. 720–725
Main authors: Shen, Bingqi; Chen, Linlong; Yang, Nan
Format: Conference paper
Language: English
Published: IEEE, 13.09.2024
ISSN: 2474-3828
Description
Summary: To address the insufficient extraction of traffic-network topology features and the inadequate modeling of spatial-temporal dependencies in traffic flow prediction, we construct a dual graph convolutional network based on fast Dynamic Time Warping (fastDTW) and a spatial-temporal attention mechanism (STADGCN). Firstly, the spatial-temporal attention module is employed to capture the dynamic influence weights of the spatial-temporal dimensions. Secondly, fastDTW is utilized to measure similarity between nodes in the traffic network, enhancing topology-based feature extraction through adjacency-matrix encoding. Subsequently, dual graph convolutional and temporal convolutional networks are constructed to model spatial-temporal dependencies. Finally, the recent, daily, and weekly components are combined by weighted fusion, and the prediction performance of the STADGCN algorithm is verified on real highway-network detector data. Experimental results demonstrate that, compared to ARIMA, VAR, FNN, GAT, GCN, GWNet, STGCN, and ASTGCN, STADGCN exhibits superior performance on the PEMS08 dataset, with MAPE reductions of 81.97%, 64.52%, 78.85%, 69.44%, 54.17%, 8.33%, 8.33%, and 26.67%, respectively.
DOI: 10.1109/ITME63426.2024.00146
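
The fastDTW-based adjacency encoding named in the title is the most distinctive step in the summary above. The following is a minimal sketch of how such an encoding could be built, assuming the open-source fastdtw Python package together with NumPy and SciPy, and a hypothetical (num_nodes, num_timesteps) array of detector readings; it illustrates the general technique only and is not the authors' released implementation. The function name, sigma, and threshold parameters are illustrative choices not specified in the record.

import numpy as np
from fastdtw import fastdtw
from scipy.spatial.distance import euclidean


def dtw_adjacency(flow, sigma=10.0, threshold=0.5):
    """Build a similarity adjacency matrix from pairwise fastDTW distances.

    flow: array of shape (num_nodes, num_timesteps), one historical series
          per detector (hypothetical input layout).
    sigma, threshold: illustrative Gaussian-kernel width and sparsification
          cutoff; the record does not specify these values.
    """
    num_nodes = flow.shape[0]
    adj = np.zeros((num_nodes, num_nodes))
    for i in range(num_nodes):
        for j in range(i + 1, num_nodes):
            # fastdtw expects sequences of points; reshape each 1-D series
            # to (T, 1) so SciPy's euclidean metric applies element-wise.
            dist, _ = fastdtw(flow[i].reshape(-1, 1),
                              flow[j].reshape(-1, 1),
                              dist=euclidean)
            # A Gaussian kernel converts the DTW distance into a similarity.
            sim = np.exp(-(dist ** 2) / (2 * sigma ** 2))
            if sim >= threshold:
                adj[i, j] = adj[j, i] = sim
    np.fill_diagonal(adj, 1.0)
    return adj


# Example with synthetic data: 5 detectors, 288 five-minute readings.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    demo_flow = rng.random((5, 288))
    print(dtw_adjacency(demo_flow).round(3))

In the dual-graph setting described in the summary, a similarity matrix of this kind would typically be used alongside the physical road-topology adjacency matrix, giving the two graph convolution branches complementary spatial and temporal-similarity views of the traffic network.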