Comparison of Particle Swarm Optimization Algorithms in Hyperparameter Optimization Problem of Multi Layered Perceptron

Published in: Computer Assisted Methods in Engineering and Science, Volume 32, Issue 1
Main Authors: Kenta Shiomi, Tetsuya Sato, Eisuke Kita
Format: Journal Article
Language: English
Published: Institute of Fundamental Technological Research, Polish Academy of Sciences, 2025
ISSN: 2299-3649, 2956-5839
Description
Summary: This paper describes the application of particle swarm optimization (PSO) to the hyperparameter optimization problem of the multi-layered perceptron (MLP) model. Several PSO algorithms have been proposed by researchers: basic PSO, PSO with inertia weight (PSO-w), PSO with constriction factor (PSO-cf), local PSO-w, local PSO-cf, a union of local and global PSOs (UPSO), PSO with a second global best particle (SG-PSO), and PSO with a second local best particle (SP-PSO). The wine dataset is taken as a numerical example, and the hyperparameters of the MLP model are determined by the above-mentioned PSO algorithms. The sets of hyperparameters determined by these PSO algorithms are compared with the results of traditional hyperparameter optimization algorithms such as random search, the tree-structured Parzen estimator (TPE), and the covariance matrix adaptation evolution strategy (CMA-ES). Numerical results indicate that PSO-cf performs best and local PSO-w second best among the PSO algorithms, and that the sets of hyperparameters determined by the PSO algorithms were relatively similar. An important finding from the numerical results is that the PSO algorithms could find better hyperparameters than random search, TPE, and CMA-ES, demonstrating that PSO is well suited to the hyperparameter optimization problem in MLP models.
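For orientation, the inertia-weight variant (PSO-w) named in the abstract can be sketched in a few lines. This is a generic illustration of PSO-w minimizing a toy objective, not the authors' implementation; all parameter values (`w`, `c1`, `c2`, swarm size) are illustrative assumptions.

```python
import random

def pso_w(objective, bounds, n_particles=20, n_iter=100,
          w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimize `objective` over box `bounds` with inertia-weight PSO (PSO-w)."""
    rng = random.Random(seed)
    dim = len(bounds)
    # Initialize particle positions uniformly within the bounds; zero velocities.
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                  # personal best positions
    pbest_val = [objective(p) for p in pos]      # personal best values
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # global best

    for _ in range(n_iter):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # PSO-w velocity update: inertia + cognitive + social terms.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # Move the particle and clip it to the search box.
                pos[i][d] = min(max(pos[i][d] + vel[i][d],
                                    bounds[d][0]), bounds[d][1])
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy usage: minimize the sphere function. In the paper's setting, the
# objective would instead be MLP validation loss evaluated at the
# hyperparameters encoded in each particle's position.
best, best_val = pso_w(lambda x: sum(v * v for v in x), [(-5, 5)] * 3)
```

In the hyperparameter-optimization setting, each coordinate of a particle encodes one hyperparameter (e.g. learning rate or hidden-layer width), and PSO-cf differs from this sketch mainly in multiplying the whole velocity update by a constriction factor instead of applying the inertia weight `w`.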
DOI:10.24423/cames.2025.1730