Multi-Strategy Assisted Multi-Objective Whale Optimization Algorithm for Feature Selection
| Published in: | Computer Modeling in Engineering & Sciences, Vol. 140, No. 2, pp. 1563–1593 |
|---|---|
| Main authors: | , , , , |
| Format: | Journal Article |
| Language: | English |
| Published: | Henderson: Tech Science Press, 2024 |
| Subjects: | |
| ISSN: | 1526-1492, 1526-1506 |
| Online access: | Full text |
| Abstract: | In classification problems, datasets often contain a large number of features, but not all of them are relevant for accurate classification; irrelevant features may even hinder classification accuracy. Feature selection aims to alleviate this issue by minimizing the number of features in the subset while simultaneously minimizing the classification error rate. Single-objective optimization approaches employ an evaluation function designed as an aggregate function with a parameter, but the results obtained depend on the value of that parameter. To eliminate the parameter's influence, the problem can be reformulated as a multi-objective optimization problem. The Whale Optimization Algorithm (WOA) is widely used in optimization problems because of its simplicity and ease of implementation. In this paper, we propose a multi-strategy assisted multi-objective WOA (MSMOWOA) to address feature selection. To enhance the algorithm's search ability, we integrate multiple strategies such as Levy flight, the Grey Wolf Optimizer, and adaptive mutation into it. Additionally, we utilize an external repository to store the non-dominated solution set and grid technology to maintain diversity. Results on fourteen University of California Irvine (UCI) datasets demonstrate that the proposed method effectively removes redundant features and improves classification performance. The source code can be accessed from the website: . |
| Bibliography: | ObjectType-Article-1, SourceType-Scholarly Journals-1, ObjectType-Feature-2, content type line 14 |
| DOI: | 10.32604/cmes.2024.048049 |
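The abstract frames feature selection as a two-objective problem (minimize the number of selected features and the classification error rate) whose non-dominated solutions are kept in an external repository. The sketch below is not the paper's MSMOWOA; it is a minimal illustration, assuming a k-NN wrapper evaluator, a scikit-learn dataset as a stand-in for the UCI data, and random feature masks in place of WOA-updated positions, of how candidate subsets could be scored on the two objectives and maintained in a Pareto archive.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def evaluate(mask, X, y):
    """Two objectives to minimize: selected-feature ratio and 5-fold CV error (assumed k-NN wrapper)."""
    if not mask.any():                               # an empty subset cannot classify
        return np.array([1.0, 1.0])
    acc = cross_val_score(KNeighborsClassifier(n_neighbors=5), X[:, mask], y, cv=5).mean()
    return np.array([mask.sum() / mask.size, 1.0 - acc])

def dominates(a, b):
    """a Pareto-dominates b: no worse in every objective, strictly better in at least one."""
    return bool(np.all(a <= b) and np.any(a < b))

def update_archive(archive, candidate):
    """Keep only mutually non-dominated (mask, objectives) pairs."""
    _, objs = candidate
    if any(dominates(o, objs) for _, o in archive):
        return archive                               # candidate is dominated, discard it
    archive = [(m, o) for m, o in archive if not dominates(objs, o)]
    archive.append(candidate)
    return archive

X, y = load_breast_cancer(return_X_y=True)           # stand-in dataset, not one of the paper's fourteen
rng = np.random.default_rng(0)
archive = []
for _ in range(50):                                  # random masks stand in for WOA-updated whale positions
    mask = rng.random(X.shape[1]) < 0.5
    archive = update_archive(archive, (mask, evaluate(mask, X, y)))

for mask, objs in sorted(archive, key=lambda c: c[1][0]):
    print(f"{int(mask.sum()):2d} features  error={objs[1]:.3f}")
```

Only the archive-update step is shown; the Levy flight, Grey Wolf Optimizer, adaptive mutation, and grid-based diversity maintenance described in the abstract are omitted from this sketch.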