Multiple strategies based Grey Wolf Optimizer for feature selection in performance evaluation of open-ended funds

Bibliographic Details
Published in: Swarm and Evolutionary Computation, Vol. 86, p. 101518
Main Authors: Chang, Dan; Rao, Congjun; Xiao, Xinping; Hu, Fuyan; Goh, Mark
Format: Journal Article
Language: English
Published: Elsevier B.V., 01.04.2024
ISSN: 2210-6502
Description
Summary: The methods for selecting features when evaluating fund performance rely heavily on traditional statistics, which can lead to excessive data dimensionality in a multi-dimensional context. The Grey Wolf Optimizer (GWO), a swarm intelligence optimization algorithm with a simple structure and few parameters, is widely used in feature selection. However, the algorithm suffers from convergence to local optima and an imbalance between exploration and exploitation. This paper proposes a Multi-Strategy Grey Wolf Optimizer (MSGWO) to address these limitations and to identify the relevant features for evaluating fund performance. Random opposition-based learning is applied to enhance population quality during the initialization phase. Moreover, the convergence factor is made nonlinear to coordinate the global exploration and local exploitation capabilities. Finally, a two-stage hybrid mutation operator modifies the position-updating mechanism, increasing population diversity and balancing the exploration and exploitation abilities of GWO. The proposed algorithm is compared against six related algorithms and verified by the Wilcoxon signed-rank test on 12 quarterly datasets (2020-2022) of Chinese open-ended funds. The results show that MSGWO reduces both the feature subset size and the classification error rate.
DOI: 10.1016/j.swevo.2024.101518
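The summary names two of the paper's strategies concretely enough to sketch: random opposition-based learning (ROBL) for initialization, and a nonlinear schedule for GWO's convergence factor. The snippet below is an illustrative sketch only, not the authors' implementation: the cosine decay schedule, the function names, and the sphere fitness function are all assumptions made for demonstration.

```python
import numpy as np

def robl_init(pop_size, dim, lb, ub, fitness, rng):
    """Random opposition-based learning (ROBL) initialization sketch.

    Generates a random population and its random-opposite counterpart
    (x' = lb + ub - r * x, with r ~ U(0, 1)), then keeps the fitter
    point of each pair. Minimization is assumed.
    """
    X = rng.uniform(lb, ub, size=(pop_size, dim))
    X_opp = np.clip(lb + ub - rng.uniform(size=(pop_size, dim)) * X, lb, ub)
    keep_original = fitness(X) <= fitness(X_opp)
    return np.where(keep_original[:, None], X, X_opp)

def nonlinear_a(t, T):
    """Nonlinear convergence factor decaying from 2 to 0 over T iterations.

    Standard GWO uses the linear schedule a = 2 - 2*t/T; a cosine form is
    used here purely for illustration, since the abstract does not give
    the paper's exact nonlinear schedule.
    """
    return 1.0 + np.cos(np.pi * t / T)

# Usage with a toy sphere objective (an assumption, for demonstration only).
rng = np.random.default_rng(42)
sphere = lambda X: np.sum(X ** 2, axis=1)
wolves = robl_init(pop_size=20, dim=5, lb=-10.0, ub=10.0,
                   fitness=sphere, rng=rng)
```

Keeping the fitter of each (point, opposite-point) pair starts the search from a population at least as good as a purely random one, which is the quality gain the summary attributes to the initialization phase.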