Comparison between Adam, AdaMax and AdamW optimizers to implement a Weather Forecast based on Neural Networks for the Andean city of Quito

Detailed bibliography
Published in: 2021 IEEE Fifth Ecuador Technical Chapters Meeting (ETCM), pp. 1-6
Main authors: Llugsi, Ricardo; Yacoubi, Samira El; Fontaine, Allyx; Lupera, Pablo
Format: Conference paper
Language: English
Published: IEEE, 12 October 2021
Description
Summary: The main function of an optimizer is to determine by how much to change the weights and the learning rate of the neural network in order to reduce losses. One of the best-known optimizers is Adam, whose main advantage is the invariance of the magnitudes of the parameter updates with respect to rescaling of the gradient. However, other optimizers are often chosen because they generalize better. AdamW is a variant of Adam in which the weight decay is applied only after the parameter-wise step size has been determined. To provide a comparative scenario for optimizers, the present work implements a temperature forecast for the Andean city of Quito using a neural network structure with uncertainty reduction, and analyzes three optimizers (Adam, AdaMax and AdamW). For the comparison, three error metrics were computed per hour to determine the effectiveness of the prediction. The analysis shows that Adam and AdaMax behave similarly, reaching a maximum MSE per hour of 2.5°C, whereas AdamW reduces this error to around 1.3°C.
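To illustrate the distinction the summary draws, the NumPy sketch below contrasts the three update rules. This is a minimal reconstruction of the textbook updates from the original Adam and AdamW papers (Kingma & Ba; Loshchilov & Hutter), not the authors' implementation, and the default hyperparameters (lr, beta1, beta2, eps, weight_decay) are the conventional ones rather than values from this work. Note how Adam folds weight decay into the gradient, so it is rescaled by the adaptive step, while AdamW applies it after the adaptive step is computed.

import numpy as np

def adam_step(w, g, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999,
              eps=1e-8, weight_decay=0.0):
    # Classic Adam: weight decay enters through the gradient (L2 penalty),
    # so it gets rescaled by the adaptive per-parameter step size.
    g = g + weight_decay * w
    m = beta1 * m + (1 - beta1) * g          # first-moment estimate
    v = beta2 * v + (1 - beta2) * g * g      # second-moment estimate
    m_hat = m / (1 - beta1 ** t)             # bias correction
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v

def adamax_step(w, g, m, u, t, lr=2e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    # AdaMax: replaces the second-moment estimate with an exponentially
    # weighted infinity norm; u needs no bias correction.
    m = beta1 * m + (1 - beta1) * g
    u = np.maximum(beta2 * u, np.abs(g))
    m_hat = m / (1 - beta1 ** t)
    w = w - lr * m_hat / (u + eps)           # eps added for numerical safety
    return w, m, u

def adamw_step(w, g, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999,
               eps=1e-8, weight_decay=1e-2):
    # AdamW: the adaptive step is computed from the raw gradient first;
    # weight decay is then applied separately, decoupled from that step size.
    m = beta1 * m + (1 - beta1) * g
    v = beta2 * v + (1 - beta2) * g * g
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps) - lr * weight_decay * w
    return w, m, v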
DOI: 10.1109/ETCM53643.2021.9590681