Deep Learning Surrogate Models of JULES-INFERNO for Wildfire Prediction on a Global Scale

Bibliographic Details
Title: Deep Learning Surrogate Models of JULES-INFERNO for Wildfire Prediction on a Global Scale
Authors: Sibo Cheng, Hector Chassagnon, Matthew Kasoar, Yike Guo, Rossella Arcucci
Source: IEEE Transactions on Emerging Topics in Computational Intelligence, 9:444-454
Publication Status: Preprint
Publisher Information: Institute of Electrical and Electronics Engineers (IEEE), 2025.
Publication Year: 2025
Keywords: FOS: Computer and information sciences, Mathematical models, Computer Science - Machine Learning, Numerical models, Computer Science - Artificial Intelligence, Data models, Computational modeling, Wildfire, 15. Life on land, Surrogate modelling, 7. Clean energy, Long short term memory, Machine Learning (cs.LG), Predictive models, Atmospheric modeling, Artificial Intelligence (cs.AI), 13. Climate action, ConvLSTM, Machine learning
Description: Global wildfire models play a crucial role in anticipating and responding to changing wildfire regimes. JULES-INFERNO is a global vegetation and fire model that simulates wildfire emissions and burnt area on a global scale. However, because of the high data dimensionality and system complexity, JULES-INFERNO's computational cost makes it challenging to apply to fire risk forecasting with unseen initial conditions. Typically, running JULES-INFERNO for 30 years of prediction takes several hours on High Performance Computing (HPC) clusters. To tackle this bottleneck, two data-driven models based on Deep Learning techniques are built in this work to surrogate the JULES-INFERNO model and speed up global wildfire forecasting. More precisely, these machine learning models take global temperature, vegetation density, soil moisture and previous forecasts as inputs to predict the subsequent global area burnt on an iterative basis. Average Error per Pixel (AEP) and the Structural Similarity Index Measure (SSIM) are used as metrics to evaluate the performance of the proposed surrogate models. A fine-tuning strategy is also proposed in this work to improve algorithm performance for unseen scenarios. Numerical results show strong performance of the proposed models, in terms of both computational efficiency (less than 20 seconds for 30 years of prediction on a laptop CPU) and prediction accuracy (AEP under 0.3% and SSIM over 98% compared to the outputs of JULES-INFERNO).
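The abstract describes an iterative (autoregressive) prediction scheme and two evaluation metrics, AEP and SSIM. The following is a minimal, hypothetical Python sketch of such an evaluation loop; the exact AEP definition used in the paper, the `surrogate` callable, and the data layout are illustrative assumptions, not taken from the publication.

```python
import numpy as np
from skimage.metrics import structural_similarity

def average_error_per_pixel(pred, ref):
    # Assumed AEP definition: mean absolute difference per grid cell,
    # expressed as a fraction (multiply by 100 for a percentage).
    return float(np.mean(np.abs(pred - ref)))

def rollout(surrogate, burnt_area_0, climate_forcing, n_steps):
    # Autoregressive forecasting as described in the abstract: the previous
    # prediction is fed back in together with the climate inputs for step t.
    # `surrogate` is a hypothetical callable (burnt_area, forcing) -> next burnt_area map.
    preds = []
    burnt_area = burnt_area_0
    for t in range(n_steps):
        burnt_area = surrogate(burnt_area, climate_forcing[t])
        preds.append(burnt_area)
    return np.stack(preds)

def evaluate(preds, refs):
    # Compare a surrogate rollout against JULES-INFERNO outputs.
    # Burnt-area fractions are assumed to lie in [0, 1], hence data_range=1.0.
    aep = np.mean([average_error_per_pixel(p, r) for p, r in zip(preds, refs)])
    ssim = np.mean([structural_similarity(p, r, data_range=1.0) for p, r in zip(preds, refs)])
    return float(aep), float(ssim)
```

A rollout of this form, with a lightweight network as the surrogate, is consistent with the reported runtime of under 20 seconds for 30 years of prediction on a laptop CPU.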
Publication Type: Article
ISSN: 2471-285X
DOI: 10.1109/tetci.2024.3445450
DOI: 10.48550/arxiv.2409.00237
Access URL: http://arxiv.org/abs/2409.00237
Rights: IEEE Copyright
CC BY-NC-SA
Document ID: edsair.doi.dedup.....26720e8ac8ce4cda4cdb3a2826525c2c
Database: OpenAIRE