Missing-Insensitive Short-Term Load Forecasting Leveraging Autoencoder and LSTM
| Published in: | IEEE Access, Volume 8, pp. 206039-206048 |
|---|---|
| Main authors: | , , , |
| Format: | Journal Article |
| Language: | English |
| Publisher: | Piscataway: IEEE, 2020 (The Institute of Electrical and Electronics Engineers, Inc.) |
| ISSN: | 2169-3536 |
| Summary: | In most deep learning-based load forecasting, an intact dataset is required. Since many real-world datasets contain missing values for various reasons, missing imputation using deep learning is actively studied. However, missing imputation and load forecasting have been considered independently so far. In this article, we provide a deep learning framework that jointly considers missing imputation and load forecasting. We consider a family of autoencoder/long short-term memory (LSTM) combined models for missing-insensitive load forecasting. Specifically, autoencoder (AE), denoising autoencoder (DAE), convolutional autoencoder (CAE), and denoising convolutional autoencoder (DCAE) are considered for extracting features, of which the encoded outputs are fed into the input of LSTM. Our experiments show that the proposed DCAE/LSTM combined model significantly improves forecasting accuracy no matter what missing rate or type (random missing, consecutive block missing) occurs compared to the baseline LSTM. |
|---|---|
| DOI: | 10.1109/ACCESS.2020.3036885 |
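The abstract above outlines the architecture: a denoising convolutional autoencoder (DCAE) learns to reconstruct corrupted load windows, and its encoded features are fed into an LSTM for forecasting. The sketch below is a minimal PyTorch illustration of that idea, not the authors' published configuration; the 24-step window, layer sizes, zero-filled random missing values, and joint reconstruction-plus-forecast loss are all illustrative assumptions.

```python
# Minimal sketch of a DCAE encoder feeding an LSTM forecaster.
# All hyperparameters here are assumptions, not values from the paper.
import torch
import torch.nn as nn


class DCAE(nn.Module):
    """Denoising convolutional autoencoder over a load window."""

    def __init__(self, in_channels: int = 1, latent_channels: int = 8):
        super().__init__()
        # Encoder: 1-D convolutions compress the (possibly corrupted) window.
        self.encoder = nn.Sequential(
            nn.Conv1d(in_channels, 16, kernel_size=3, stride=2, padding=1),
            nn.ReLU(),
            nn.Conv1d(16, latent_channels, kernel_size=3, stride=2, padding=1),
            nn.ReLU(),
        )
        # Decoder: transposed convolutions reconstruct the clean window.
        self.decoder = nn.Sequential(
            nn.ConvTranspose1d(latent_channels, 16, kernel_size=3, stride=2,
                               padding=1, output_padding=1),
            nn.ReLU(),
            nn.ConvTranspose1d(16, in_channels, kernel_size=3, stride=2,
                               padding=1, output_padding=1),
        )

    def forward(self, x):
        z = self.encoder(x)          # (batch, latent_channels, T/4)
        return self.decoder(z), z    # reconstruction and latent features


class DCAELSTMForecaster(nn.Module):
    """Feeds the DCAE's encoded features into an LSTM that predicts the next step."""

    def __init__(self, latent_channels: int = 8, hidden_size: int = 64):
        super().__init__()
        self.dcae = DCAE(latent_channels=latent_channels)
        self.lstm = nn.LSTM(input_size=latent_channels, hidden_size=hidden_size,
                            batch_first=True)
        self.head = nn.Linear(hidden_size, 1)  # next-step load

    def forward(self, x):
        recon, z = self.dcae(x)                # z: (batch, C, T')
        seq = z.permute(0, 2, 1)               # LSTM expects (batch, T', C)
        out, _ = self.lstm(seq)
        forecast = self.head(out[:, -1, :])    # use the last hidden state
        return forecast, recon


if __name__ == "__main__":
    # Toy usage: 24-step load windows with simulated random missing values.
    torch.manual_seed(0)
    batch, window = 32, 24
    clean = torch.rand(batch, 1, window)            # "true" load (dummy data)
    mask = (torch.rand_like(clean) > 0.2).float()   # ~20% random missing
    corrupted = clean * mask                        # missing entries set to 0
    target = torch.rand(batch, 1)                   # next-step load (dummy)

    model = DCAELSTMForecaster()
    forecast, recon = model(corrupted)
    # Joint objective: reconstruct the clean window and forecast the next step.
    loss = (nn.functional.mse_loss(recon, clean)
            + nn.functional.mse_loss(forecast, target))
    loss.backward()
    print(forecast.shape, recon.shape, float(loss))
```

Because the LSTM consumes the encoder's latent sequence rather than the raw readings, missing entries influence the forecast only through the denoised representation, which reflects the missing-insensitive behaviour described in the abstract.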