Multiobjective Hooke–Jeeves algorithm with a stochastic Newton–Raphson-like step-size method



Bibliographic Details
Published in: Expert Systems with Applications, Vol. 117, pp. 166–175
Main Authors: Altinoz, O. Tolga, Yilmaz, A. Egemen
Format: Journal Article
Language: English
Published: New York: Elsevier Ltd, 01.03.2019
Elsevier BV
Subjects:
ISSN:0957-4174, 1873-6793
Online Access: Full text
Description
Summary: •The multiobjective version of the Hooke–Jeeves (MOJH) algorithm is proposed. •A new step-size methodology derived from the Newton–Raphson (NR) algorithm is integrated into MOJH. •The performance of MOJH is improved with the NR-based stochastic step-size method. Computational optimization research focuses on improving meta-heuristic algorithms so that they can handle problems with more than one objective; such improved algorithms are called multiobjective optimization algorithms. As the number of objectives increases, the computational cost of the algorithm increases as well. Because classical optimization algorithms follow the direction of descending values by calculating derivatives of the function, a classical optimization algorithm can serve as the core of a novel multiobjective optimization algorithm. Among the classical optimization algorithms, in this study the Hooke–Jeeves (HJ) algorithm is selected as the basis of the proposed multiobjective optimization algorithm, in which members of the proposed population-based HJ algorithm move toward the Pareto front by checking two neighborhood solutions in each dimension, with a dynamic distance calculated using a Newton–Raphson-like stochastic step-size method. Unlike many multiobjective optimization algorithms, the performance of the proposed algorithm depends chiefly on the dimension of the decision space rather than on the number of objectives: as the number of objectives increases without changing the decision dimension, the computational cost remains almost the same. In addition, the proposed algorithm can be applied to single-, multi-, and many-objective optimization problems. In this study, the behaviors of the HJ and proposed multiobjective HJ algorithms are first evaluated through theoretical and graphical demonstrations.
Next, the performance of the proposed method is evaluated on well-known benchmark problems and compared with that of the Nondominated Sorting Genetic Algorithm II (NSGA-II) using three different metrics. Finally, the algorithm is applied to many-objective optimization problems, and its performance is evaluated based on the obtained results.
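The core idea (coordinate-wise exploratory moves with a Newton–Raphson-like stochastic step size) can be sketched as follows. This is a minimal single-objective illustration under stated assumptions: the finite-difference step rule, the stochastic scaling range, and all function names here are the sketch's own choices, not the paper's exact formulation.

```python
import random

def hooke_jeeves_nr(f, x0, max_iter=200, eps=1e-6, h=1e-4):
    """Hooke-Jeeves-style exploratory search with an NR-like stochastic
    step size (illustrative sketch, not the authors' exact method)."""
    x = list(x0)
    for _ in range(max_iter):
        improved = False
        for i in range(len(x)):
            # Central finite differences along dimension i give first- and
            # second-derivative estimates, combined into an NR-like step |f'/f''|.
            xp, xm = list(x), list(x)
            xp[i] += h
            xm[i] -= h
            f0, fp, fm = f(x), f(xp), f(xm)
            d1 = (fp - fm) / (2 * h)
            d2 = (fp - 2 * f0 + fm) / (h * h)
            step = abs(d1 / d2) if abs(d2) > eps else h
            step *= random.uniform(0.5, 1.5)  # stochastic scaling (assumed range)
            # Exploratory move: probe the two neighbours along dimension i.
            for delta in (step, -step):
                trial = list(x)
                trial[i] += delta
                if f(trial) < f0:
                    x = trial
                    improved = True
                    break
        if not improved:  # no dimension improved: accept the current point
            break
    return x

# Example: minimise a shifted quadratic; the minimiser is near (3, -1).
xmin = hooke_jeeves_nr(lambda v: (v[0] - 3) ** 2 + (v[1] + 1) ** 2, [0.0, 0.0])
```

In the multiobjective setting described in the abstract, the improvement test would be replaced by a Pareto-dominance check over a population of such search points; the sketch keeps a scalar objective to stay short.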
DOI:10.1016/j.eswa.2018.09.033