Spatio-Temporal Network for Sea Fog Forecasting

Detailed bibliography
Published in: Sustainability, Volume 14, Issue 23, p. 16163
Main Authors: Park, Jinhyeok; Lee, Young Jae; Jo, Yongwon; Kim, Jaehoon; Han, Jin Hyun; Kim, Kuk Jin; Kim, Young Taeg; Kim, Seoung Bum
Format: Journal Article
Language: English
Published: Basel: MDPI AG, 01.12.2022
ISSN: 2071-1050
Description
Summary: Sea fog can seriously affect schedules and safety by reducing visibility during marine transportation. Therefore, the forecasting of sea fog is an important issue in preventing accidents. Recently, several deep learning methods have been applied to forecast sea fog, using either time series data consisting of meteorological and oceanographic observations or image data. However, these methods use only a single modality, without jointly considering meteorological and temporal characteristics. In this study, we propose a multi-modal learning method to improve the forecasting accuracy of sea fog using convolutional neural network (CNN) and gated recurrent unit (GRU) models. The CNN and GRU extract useful features from closed-circuit television (CCTV) images and multivariate time series data, respectively. CCTV images and time series data collected at Daesan Port in South Korea from 1 March 2018 to 14 February 2021 by the Korea Hydrographic and Oceanographic Agency (KHOA) were used to evaluate the proposed method. We compare the proposed method with deep learning methods that consider only temporal or only spatial information. The results indicate that the proposed method, which uses temporal and spatial information simultaneously, achieves superior accuracy.
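The fusion described in the summary can be sketched as follows. This is a minimal illustrative model, not the authors' exact architecture: the layer sizes, image resolution, sequence length (24 steps), and number of observation variables (8) are assumptions chosen for the example. A CNN branch encodes a CCTV frame, a GRU branch encodes the recent multivariate observations, and the two feature vectors are concatenated before a fog / no-fog prediction head.

```python
import torch
import torch.nn as nn

class FogNet(nn.Module):
    """Hypothetical CNN + GRU multi-modal fusion for sea fog forecasting."""

    def __init__(self, n_vars=8, hidden=32):
        super().__init__()
        # CNN branch: a small conv stack over a 3-channel CCTV frame,
        # pooled to a fixed-length feature vector.
        self.cnn = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # GRU branch: encodes the multivariate observation sequence.
        self.gru = nn.GRU(input_size=n_vars, hidden_size=hidden,
                          batch_first=True)
        # Fusion head: concatenated spatial + temporal features
        # mapped to a single fog probability.
        self.head = nn.Sequential(
            nn.Linear(32 + hidden, 16), nn.ReLU(), nn.Linear(16, 1),
        )

    def forward(self, image, series):
        img_feat = self.cnn(image)        # (batch, 32)
        _, h = self.gru(series)           # h: (1, batch, hidden)
        fused = torch.cat([img_feat, h[-1]], dim=1)
        return torch.sigmoid(self.head(fused))

model = FogNet()
img = torch.randn(4, 3, 64, 64)   # batch of 4 CCTV frames (assumed size)
ts = torch.randn(4, 24, 8)        # 24 time steps x 8 variables (assumed)
out = model(img, ts)              # (4, 1) fog probabilities in [0, 1]
```

In practice the two branches would be trained jointly on labeled fog events, so the gradient from the shared head shapes both the image and the time series representations at once.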
DOI: 10.3390/su142316163