Grey wolf optimizer with self-repulsion strategy for feature selection
| Published in: | Scientific Reports, Vol. 15, No. 1, p. 12807 - 24 |
|---|---|
| Main authors: | , , , , , |
| Format: | Journal Article |
| Language: | English |
| Published: | London: Nature Publishing Group UK, 14.04.2025 |
| ISSN: | 2045-2322 |
| Summary: | Feature selection is one of the most critical steps in big data analysis. Accurately extracting correct features from massive data can effectively improve the accuracy of big data processing algorithms. However, traditional grey wolf optimizer (GWO) algorithms often suffer from slow convergence and a tendency to fall into local optima, limiting their effectiveness in high-dimensional feature selection tasks. To address these limitations, we propose a novel feature selection algorithm called grey wolf optimizer with self-repulsion strategy (GWO-SRS). In GWO-SRS, the hierarchical structure of the wolf pack is flattened to enable rapid transmission of commands from the alpha wolf to each member, thereby accelerating convergence. Additionally, two distinct learning strategies are employed: the self-repulsion learning strategy for the alpha wolf and the pack learning strategy based on the predatory behavior of the alpha wolf, facilitating rapid self-learning for both the alpha wolf and the pack. These improvements effectively mitigate the weaknesses of traditional GWO, such as premature convergence and limited exploration capability. Finally, we conduct a comparative experimental analysis on the UCI test dataset using five relevant feature selection algorithms. The results demonstrate that the average classification error of GWO-SRS is reduced by approximately 15% compared to related algorithms, while utilizing 20% fewer features. This work highlights the need to address the inherent limitations of GWO and provides a robust solution to complex feature selection problems. (A sketch of the baseline GWO wrapper that GWO-SRS builds on follows this record.) |
|---|---|
| DOI: | 10.1038/s41598-025-97224-8 |
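The summary describes GWO-SRS only at a high level, so the sketch below shows the classical wrapper-style GWO for feature selection that the paper improves upon: continuous wolf positions thresholded into binary feature masks and scored by a weighted error-plus-feature-ratio objective. This is a minimal illustration under assumed settings (the `error_fn` hook, the 0.5 threshold, and the 0.99 weight are illustrative), not the paper's GWO-SRS; the self-repulsion strategy and flattened pack hierarchy are not implemented here.

```python
import numpy as np

def gwo_feature_selection(error_fn, dim, wolves=20, iters=100, seed=0):
    """Classical GWO wrapper for feature selection (baseline, not GWO-SRS).

    error_fn: hypothetical user-supplied hook that takes a boolean mask of
              length `dim` and returns a classification error in [0, 1].
    """
    rng = np.random.default_rng(seed)

    def to_mask(pos):
        mask = pos > 0.5                       # simple threshold transfer (assumed)
        if not mask.any():
            mask[rng.integers(dim)] = True     # keep at least one selected feature
        return mask

    def fitness(pos, w=0.99):
        m = to_mask(pos)
        # standard wrapper objective: weighted error plus feature ratio (assumed weights)
        return w * error_fn(m) + (1 - w) * m.mean()

    X = rng.random((wolves, dim))              # wolf positions in [0, 1]^dim
    for t in range(iters):
        scores = np.array([fitness(x) for x in X])
        alpha, beta, delta = X[np.argsort(scores)[:3]]   # three best wolves lead the pack
        a = 2.0 * (1.0 - t / iters)                      # linearly decreasing coefficient
        for i in range(wolves):
            step = np.zeros(dim)
            for leader in (alpha, beta, delta):
                r1, r2 = rng.random(dim), rng.random(dim)
                A, C = 2.0 * a * r1 - a, 2.0 * r2
                D = np.abs(C * leader - X[i])
                step += leader - A * D                   # encircling move toward each leader
            X[i] = np.clip(step / 3.0, 0.0, 1.0)
    best = min(X, key=fitness)
    return to_mask(best), fitness(best)

# Toy usage with a synthetic error that rewards selecting the first three features.
if __name__ == "__main__":
    target = np.zeros(10, dtype=bool)
    target[:3] = True
    err = lambda m: float(np.mean(m != target))
    mask, score = gwo_feature_selection(err, dim=10, wolves=15, iters=50)
    print(mask, score)
```

In practice the `error_fn` hook would wrap a classifier evaluated on a UCI-style dataset (e.g. cross-validated error of a k-NN trained on the selected columns); the toy objective above only keeps the example self-contained.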