Innovative Feature Selection Method Based on Hybrid Sine Cosine and Dipper Throated Optimization Algorithms

Bibliographic Details
Published in: IEEE Access, Vol. 11, pp. 79750-79776
Main Authors: Abdelhamid, Abdelaziz A., El-Kenawy, El-Sayed M., Ibrahim, Abdelhameed, Eid, Marwa Metwally, Khafaga, Doaa Sami, Alhussan, Amel Ali, Mirjalili, Seyedali, Khodadadi, Nima, Lim, Wei Hong, Shams, Mahmoud Y.
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2023
ISSN: 2169-3536
Description
Summary:
Introduction: In pattern recognition and data mining, feature selection is one of the most crucial tasks. To increase the efficacy of classification algorithms, the most relevant subset of features in a given domain must be identified. The feature selection challenge can therefore be cast as an optimization problem, and meta-heuristic techniques can be used to solve it.
Methodology: In this work, we propose a novel hybrid binary meta-heuristic algorithm, referred to as bSCWDTO, which solves the feature selection problem by combining two algorithms: Dipper Throated Optimization (DTO) and the Sine Cosine (SC) algorithm. The sine cosine algorithm is employed to improve the exploration process and ensure that the optimization converges quickly and accurately. Thirty datasets from the University of California Irvine (UCI) machine learning repository are used to evaluate the robustness and stability of the proposed bSCWDTO algorithm, and the K-Nearest Neighbor (KNN) classifier is used to measure the effectiveness of the selected features in classification.
Results: The achieved results demonstrate the algorithm's superiority over ten state-of-the-art optimization methods: the original DTO and SC, Particle Swarm Optimization (PSO), the Whale Optimization Algorithm (WOA), Grey Wolf Optimization (GWO), Multiverse Optimization (MVO), the Satin Bowerbird Optimizer (SBO), the Genetic Algorithm (GA), the hybrid of GWO and GA, and the Firefly Algorithm (FA). Moreover, Wilcoxon's rank-sum test was performed at the 0.05 significance level to study the statistical difference between the proposed method and the alternative feature selection methods.
Conclusion: These results emphasize the significance and superiority of the proposed feature selection method and its statistically significant difference from the alternatives.
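The abstract describes a wrapper-style binary feature selection loop scored by a KNN classifier. The following is a minimal Python sketch of that general recipe, assuming scikit-learn's KNeighborsClassifier, a sigmoid transfer function, and the common weighted error-rate/feature-ratio fitness; it illustrates a generic binary sine-cosine-style update, not the authors' exact bSCWDTO formulation, and all parameter names (alpha, pop_size, n_iter) are illustrative assumptions.

    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.model_selection import cross_val_score

    def fitness(mask, X, y, alpha=0.99):
        # Common wrapper objective: weighted KNN error rate plus selected-feature ratio.
        if mask.sum() == 0:
            return 1.0  # penalize empty feature subsets
        knn = KNeighborsClassifier(n_neighbors=5)
        acc = cross_val_score(knn, X[:, mask.astype(bool)], y, cv=5).mean()
        return alpha * (1.0 - acc) + (1.0 - alpha) * mask.sum() / mask.size

    def binary_sine_cosine_select(X, y, pop_size=20, n_iter=50, seed=0):
        rng = np.random.default_rng(seed)
        dim = X.shape[1]
        pos = rng.random((pop_size, dim))            # continuous search positions
        masks = (pos > 0.5).astype(int)              # binary feature masks
        scores = np.array([fitness(m, X, y) for m in masks])
        best, best_score = masks[scores.argmin()].copy(), scores.min()
        for t in range(n_iter):
            a = 2.0 * (1.0 - t / n_iter)             # move amplitude shrinks over iterations
            for i in range(pop_size):
                r1 = a * rng.random(dim)
                r2 = 2.0 * np.pi * rng.random(dim)
                r3 = 2.0 * rng.random(dim)
                # Sine- or cosine-driven move toward the best solution found so far.
                step = np.where(rng.random(dim) < 0.5, np.sin(r2), np.cos(r2))
                pos[i] = pos[i] + r1 * step * np.abs(r3 * best - pos[i])
                prob = 1.0 / (1.0 + np.exp(-pos[i])) # sigmoid transfer to [0, 1]
                masks[i] = (rng.random(dim) < prob).astype(int)
                s = fitness(masks[i], X, y)
                if s < best_score:
                    best_score, best = s, masks[i].copy()
        return best, best_score

On a UCI-style dataset, a call such as best_mask, best_fit = binary_sine_cosine_select(X, y) returns the selected-feature mask and its fitness. The reported comparison uses Wilcoxon's rank-sum test at the 0.05 level; with SciPy that check looks roughly like the lines below, where the two arrays of per-run fitness values are placeholders:

    from scipy.stats import ranksums
    stat, p_value = ranksums(fitness_runs_proposed, fitness_runs_baseline)
    print("statistically different at 0.05:", p_value < 0.05)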
DOI: 10.1109/ACCESS.2023.3298955