Optimization of quantum-inspired neural network using memetic algorithm for function approximation and chaotic time series prediction


Full Description

Bibliographic Details
Published in: Neurocomputing (Amsterdam), Vol. 291, pp. 175-186
Main Authors: Ganjefar, Soheil; Tofighi, Morteza
Format: Journal Article
Language: English
Published: Elsevier B.V., 24 May 2018
ISSN: 0925-2312, 1872-8286
Online Access: Full text
Description
Abstract:
•A novel memetic algorithm based on a hybrid of genetic algorithm and gradient descent is proposed.
•We develop a new and efficient type of quantum-inspired neural network model.
•The accuracy of the approach is investigated for function approximation and time series prediction problems.
•Numerical experiments show the effectiveness and efficiency of the proposed approach.

Heuristic and deterministic optimization methods are widely applied to the training of artificial neural networks, and each has its own advantages and disadvantages. Heuristic stochastic optimization methods such as the genetic algorithm perform a global search but suffer from a slow convergence rate near the global optimum. Deterministic methods such as gradient descent, on the other hand, converge quickly near the global optimum but may get stuck in a local optimum. Motivated by these problems, this paper proposes a hybrid learning algorithm combining the genetic algorithm (GA) with gradient descent (GD), called HGAGD. The new algorithm couples the global exploration ability of GA with the accurate local exploitation ability of GD to achieve faster convergence and a more accurate final solution. HGAGD is then employed as a new training method to optimize the parameters of a quantum-inspired neural network (QINN) for two different applications. First, two benchmark functions are chosen to demonstrate the potential of the proposed QINN with the HGAGD algorithm on function approximation problems. Next, the performance of the proposed method in forecasting the Mackey-Glass time series and the Lorenz attractor is studied. The results of these studies show the superiority of the introduced approach over other published approaches.
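The GA-plus-GD idea described in the abstract can be illustrated with a minimal sketch. The objective function, population size, mutation scale, and the numeric-gradient helper below are illustrative assumptions, not the paper's exact formulation; the sketch only shows the general memetic pattern of alternating global evolutionary search with local gradient refinement.

```python
import random

def objective(x):
    # Simple 1-D test function with its minimum at x = 2 (an assumed stand-in
    # for the network's training loss).
    return (x - 2.0) ** 2

def numeric_grad(f, x, eps=1e-6):
    # Central-difference approximation of f'(x).
    return (f(x + eps) - f(x - eps)) / (2 * eps)

def gradient_refine(f, x, lr=0.1, steps=20):
    # Local exploitation: a few plain gradient-descent steps.
    for _ in range(steps):
        x -= lr * numeric_grad(f, x)
    return x

def hgagd(f, pop_size=20, generations=30, seed=0):
    # Memetic loop in the HGAGD spirit: GA for global exploration,
    # GD to polish the current best candidate each generation.
    rng = random.Random(seed)
    pop = [rng.uniform(-10.0, 10.0) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=f)
        elite = pop[: pop_size // 2]
        # Global exploration: averaging crossover plus Gaussian mutation.
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            children.append(0.5 * (a + b) + rng.gauss(0.0, 0.5))
        pop = elite + children
        # Memetic step: refine the best individual with gradient descent.
        pop[0] = gradient_refine(f, pop[0])
    return min(pop, key=f)

best = hgagd(objective)
```

In this toy setting the GA quickly brackets the minimum and the gradient steps drive the best individual close to x = 2, which is exactly the division of labor the paper motivates: exploration handles the multimodal landscape, exploitation supplies the fast final convergence.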
DOI:10.1016/j.neucom.2018.02.074