An ensemble of differential evolution and Adam for training feed-forward neural networks
Adam is an adaptive gradient descent method commonly used in back-propagation (BP) algorithms for training feed-forward neural networks (FFNNs). However, it has the drawback that it can easily become trapped in local optima. To address this problem, some metaheuristic approaches have been proposed to...
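The abstract characterizes Adam as an adaptive gradient descent method. As background only (this is not the paper's DE+Adam ensemble), a minimal single-parameter Adam update step can be sketched as follows, assuming the standard defaults beta1 = 0.9, beta2 = 0.999:

```python
import math

def adam_step(theta, grad, m, v, t, lr=0.1, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for a single parameter: exponential moving averages
    of the gradient (m) and squared gradient (v), bias-corrected, yield an
    adaptive step size."""
    m = beta1 * m + (1 - beta1) * grad         # first-moment (momentum) estimate
    v = beta2 * v + (1 - beta2) * grad * grad  # second-moment estimate
    m_hat = m / (1 - beta1 ** t)               # bias correction for early steps
    v_hat = v / (1 - beta2 ** t)
    theta -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return theta, m, v

# Toy usage: minimize f(x) = x^2 (gradient 2x) starting from x = 5.
theta, m, v = 5.0, 0.0, 0.0
for t in range(1, 2001):
    theta, m, v = adam_step(theta, 2.0 * theta, m, v, t)
```

On this convex toy problem Adam converges near the optimum; the local-optima issue the abstract refers to arises on the non-convex loss surfaces of FFNN training.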
| Published in: | Information Sciences, Vol. 608, pp. 453 - 471 |
|---|---|
| Main Authors: | , , |
| Format: | Journal Article |
| Language: | English |
| Published: | Elsevier Inc, 01.08.2022 |
| Subjects: | |
| ISSN: | 0020-0255, 1872-6291 |