A new fusion of grey wolf optimizer algorithm with a two-phase mutation for feature selection



Detailed bibliography
Published in: Expert Systems with Applications, Volume 139, p. 112824
Main authors: Abdel-Basset, Mohamed; El-Shahat, Doaa; El-henawy, Ibrahim; de Albuquerque, Victor Hugo C.; Mirjalili, Seyedali
Format: Journal Article
Language: English
Published: New York: Elsevier Ltd, 01.01.2020
Publisher: Elsevier BV
ISSN:0957-4174, 1873-6793
Description
Summary:
• A new Grey Wolf Optimizer algorithm with a Two-phase Mutation (TMGWO) is proposed.
• Wrapper-based feature selection techniques are proposed using the TMGWO algorithm.
• The proposed algorithm is benchmarked on 35 standard UCI datasets.
• The TMGWO algorithm is compared with recent state-of-the-art algorithms.
• The superior performance of the proposed algorithm is demonstrated in the experiments.

Because of their high dimensionality, large datasets can hinder the data mining process. Feature selection is therefore a mandatory pre-processing phase that reduces the dimensionality of a dataset by keeping only its most informative features while maximizing classification accuracy. This paper proposes a new Grey Wolf Optimizer algorithm integrated with a two-phase mutation to solve wrapper-based feature selection for classification problems. The sigmoid function transforms the continuous search space into a binary one to match the binary nature of the feature selection problem. The two-phase mutation enhances the exploitation capability of the algorithm: the first mutation phase reduces the number of selected features while preserving high classification accuracy, and the second mutation phase attempts to add more informative features that further increase the classification accuracy. Because the mutation phases can be time-consuming, the two-phase mutation is applied with a small probability. Since wrapper methods can give high-quality solutions, one of the most widely used wrapper classifiers, the k-Nearest Neighbor (k-NN) classifier, is employed, and the Euclidean distance is computed to find the k nearest neighbors. Each dataset is split into training and testing data using K-fold cross-validation to mitigate overfitting. The proposed algorithm is compared with well-known and recent algorithms, including the flower algorithm, particle swarm optimization, the multi-verse optimizer, the whale optimization algorithm, and the bat algorithm, on 35 datasets. Statistical analyses confirm the effectiveness of the proposed algorithm and its superior performance.
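The summary describes two core mechanisms: binarizing a wolf's continuous position with a sigmoid transfer function, and refining the resulting feature mask with a two-phase mutation (phase one drops selected features, phase two adds unselected ones). The sketch below illustrates both ideas in Python; it is not the authors' reference code, and the mutation probability, the improvement-only acceptance rule, and the names binarize, two_phase_mutation, and toy_fitness are illustrative assumptions.

# Minimal sketch of sigmoid binarization and a two-phase mutation, based on the
# summary above; hyperparameters and the acceptance rule are assumptions, not the
# paper's exact procedure.
import numpy as np

rng = np.random.default_rng(42)

def binarize(position):
    """Map a continuous position vector to a 0/1 feature mask via a sigmoid transfer."""
    prob = 1.0 / (1.0 + np.exp(-position))
    return (rng.random(position.shape) < prob).astype(int)

def two_phase_mutation(mask, fitness, mutation_prob=0.1):
    """Phase 1 tries dropping selected features, phase 2 tries adding unselected ones;
    each flip is kept only if it improves the fitness value."""
    if rng.random() > mutation_prob:            # mutation runs with a small probability
        return mask
    best, best_fit = mask.copy(), fitness(mask)
    # Phase 1: remove features to shrink the subset while preserving accuracy.
    for i in np.flatnonzero(best == 1):
        trial = best.copy(); trial[i] = 0
        f = fitness(trial)
        if f > best_fit:
            best, best_fit = trial, f
    # Phase 2: add back unselected features that raise the fitness.
    for i in np.flatnonzero(best == 0):
        trial = best.copy(); trial[i] = 1
        f = fitness(trial)
        if f > best_fit:
            best, best_fit = trial, f
    return best

# Toy usage: a fitness that simply prefers smaller subsets.
toy_fitness = lambda m: 1.0 - m.mean()
mask = binarize(rng.normal(size=10))
print(two_phase_mutation(mask, toy_fitness, mutation_prob=1.0))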
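For the wrapper evaluation, the summary states that each candidate feature subset is scored by a k-NN classifier with Euclidean distance under K-fold cross-validation. The following sketch, assuming scikit-learn and the breast-cancer dataset bundled with it, shows one plausible fitness of this kind; the accuracy/subset-size weighting alpha and the name wrapper_fitness are assumptions, not taken from the paper.

# Sketch of a wrapper-style fitness: K-fold CV accuracy of a k-NN classifier
# (Euclidean distance) restricted to the selected feature columns.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

# UCI-style dataset bundled with scikit-learn, used here only for illustration.
X, y = load_breast_cancer(return_X_y=True)

def wrapper_fitness(mask, n_neighbors=5, n_folds=10, alpha=0.99):
    """Score a 0/1 feature mask: CV accuracy on the selected columns, with a small
    reward for fewer features (the alpha weighting is an assumption)."""
    if mask.sum() == 0:                         # an empty subset is invalid
        return 0.0
    knn = KNeighborsClassifier(n_neighbors=n_neighbors, metric="euclidean")
    acc = cross_val_score(knn, X[:, mask.astype(bool)], y, cv=n_folds).mean()
    return alpha * acc + (1 - alpha) * (1 - mask.sum() / mask.size)

# Example: evaluate a random half of the features.
mask = (np.random.default_rng(0).random(X.shape[1]) < 0.5).astype(int)
print(wrapper_fitness(mask))

A fitness of this form could be passed directly as the fitness argument of the two_phase_mutation sketch above.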
DOI:10.1016/j.eswa.2019.112824