An ensemble of differential evolution and Adam for training feed-forward neural networks

Adam is an adaptive gradient-descent method commonly used in back-propagation (BP) algorithms for training feed-forward neural networks (FFNNs). However, it suffers from the drawback that it can easily become trapped in local optima. To address this problem, several metaheuristic approaches have been proposed to...
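To illustrate the general idea of combining a population-based metaheuristic with Adam, the sketch below trains a tiny 2-4-1 FFNN on XOR using DE/rand/1/bin, refining each trial vector with a few Adam steps. This is a generic illustration of hybridising differential evolution with gradient descent, not the specific ensemble algorithm proposed in the article; the network size, dataset, and all hyperparameters (`F`, `CR`, population size, learning rate) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR dataset (illustrative toy problem)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([[0], [1], [1], [0]], float)

D = 2 * 4 + 4 + 4 * 1 + 1  # flattened weights of a 2-4-1 network

def unpack(w):
    # Split a flat vector into the network's weight matrices and biases.
    W1 = w[:8].reshape(2, 4)
    b1 = w[8:12]
    W2 = w[12:16].reshape(4, 1)
    b2 = w[16:]
    return W1, b1, W2, b2

def forward(w, X):
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # sigmoid output
    return p, h

def loss(w):
    p, _ = forward(w, X)
    return float(np.mean((p - y) ** 2))

def grad(w):
    # Backpropagation of the MSE loss through the tiny network.
    _, _, W2, _ = unpack(w)
    p, h = forward(w, X)
    dz2 = 2 * (p - y) / len(X) * p * (1 - p)
    gW2 = h.T @ dz2
    gb2 = dz2.sum(0)
    dh = (dz2 @ W2.T) * (1 - h ** 2)
    gW1 = X.T @ dh
    gb1 = dh.sum(0)
    return np.concatenate([gW1.ravel(), gb1, gW2.ravel(), gb2])

def adam_steps(w, steps=20, lr=0.05, b1=0.9, b2=0.999, eps=1e-8):
    # Standard Adam update, used here as a local refinement operator.
    m = np.zeros_like(w)
    v = np.zeros_like(w)
    for t in range(1, steps + 1):
        g = grad(w)
        m = b1 * m + (1 - b1) * g
        v = b2 * v + (1 - b2) * g * g
        w = w - lr * (m / (1 - b1 ** t)) / (np.sqrt(v / (1 - b2 ** t)) + eps)
    return w

# DE/rand/1/bin over a population of weight vectors, with Adam refinement.
NP, F, CR = 12, 0.5, 0.9
pop = rng.normal(0, 1, (NP, D))
fit = np.array([loss(w) for w in pop])
for gen in range(30):
    for i in range(NP):
        idx = rng.choice([j for j in range(NP) if j != i], 3, replace=False)
        a, b, c = pop[idx]
        mask = rng.random(D) < CR
        mask[rng.integers(D)] = True          # at least one mutant gene
        trial = np.where(mask, a + F * (b - c), pop[i])
        trial = adam_steps(trial)             # gradient-based local search
        f_trial = loss(trial)
        if f_trial < fit[i]:                  # greedy DE selection
            pop[i], fit[i] = trial, f_trial

best = pop[np.argmin(fit)]
print("best MSE:", fit.min())
```

The division of labour here is the usual motivation for such hybrids: DE's mutation and crossover provide global exploration across basins of attraction, while the Adam refinement step exploits gradient information to descend quickly within each basin.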


Bibliographic Details
Published in: Information Sciences, Vol. 608, pp. 453-471
Main Authors: Xue, Yu, Tong, Yiling, Neri, Ferrante
Format: Journal Article
Language:English
Published: Elsevier Inc., 01.08.2022
ISSN:0020-0255, 1872-6291