Feature selection for binary classification based on class labeling, SOM, and hierarchical clustering
| Published in: | Measurement and Control (London), Vol. 56, Issue 9-10, pp. 1649-1669 |
|---|---|
| Main authors: | , , |
| Format: | Journal Article |
| Language: | English |
| Published: | London, England: SAGE Publications, 01.11.2023 |
| ISSN: | 0020-2940, 2051-8730 |
| Online access: | Full text |
| Abstract: | Feature selection plays an important role in algorithms for processing high-dimensional data. Traditional pattern classification and information theory methods are widely applied to feature selection. However, traditional pattern classification methods such as Fisher Score, Laplacian Score, and Relief make inadequate use of class labels, and previous information-theoretic feature selection methods such as MIFS ignore the property that samples are tight within a class and sparse between classes. To address these problems, a feature selection algorithm for the binary classification problem is proposed, based on class label transformation, a self-organizing map (SOM) neural network, and agglomerative hierarchical clustering. The algorithm first uses class label mapping to convert class labels that have no numerical meaning into numerical values that can participate in computation while retaining classification information, and forms two-dimensional vectors from these values and the attribute values under evaluation. These two-dimensional vectors are then clustered using the SOM neural network and hierarchical clustering. Finally, an evaluation function value is calculated that reflects intra-cluster tightness, inter-cluster separation, and the division accuracy after clustering; it measures the ability of candidate attributes to distinguish between classes. Experiments verify that the algorithm is robust, effectively screens attributes with strong classification ability, and improves the prediction performance of the classifier. |
|---|---|
| ISSN: | 0020-2940, 2051-8730 |
|---|---|
| DOI: | 10.1177/00202940231173748 |
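The abstract only outlines the pipeline (class label mapping, clustering of two-dimensional vectors, and a score based on tightness, separation, and division accuracy). The sketch below is an illustrative reconstruction under assumptions, not the paper's implementation: the per-class-mean label mapping, the use of scikit-learn's AgglomerativeClustering alone (the SOM stage is omitted), the function name `feature_score`, and the combined score formula are all assumptions.

```python
# Illustrative sketch only: the record above does not specify the paper's exact
# label mapping, SOM configuration, or evaluation function, so the choices below
# are assumptions standing in for the described steps.
import numpy as np
from sklearn.cluster import AgglomerativeClustering

def feature_score(x, y):
    """Score one candidate feature x against binary labels y (values 0/1)."""
    # Step 1 (assumed mapping): turn the symbolic class labels into numeric
    # values that can enter the computation, here the per-class feature mean.
    label_map = {c: x[y == c].mean() for c in (0, 1)}
    mapped = np.array([label_map[c] for c in y])

    # Step 2: form two-dimensional vectors from the candidate attribute values
    # and the mapped labels, then cluster them into two groups
    # (hierarchical clustering only; the SOM stage is omitted here).
    vecs = np.column_stack([x, mapped])
    clusters = AgglomerativeClustering(n_clusters=2).fit_predict(vecs)

    # Step 3 (assumed score): combine intra-cluster tightness, inter-cluster
    # separation, and agreement of the clustering with the true labels.
    centers = np.array([vecs[clusters == k].mean(axis=0) for k in (0, 1)])
    tightness = np.mean([np.linalg.norm(vecs[clusters == k] - centers[k], axis=1).mean()
                         for k in (0, 1)])
    separation = np.linalg.norm(centers[0] - centers[1])
    accuracy = max(np.mean(clusters == y), np.mean(clusters != y))
    return separation * accuracy / (tightness + 1e-12)

# Usage: a feature correlated with the class should outrank pure noise.
rng = np.random.default_rng(0)
y = rng.integers(0, 2, size=200)
informative = y + 0.3 * rng.normal(size=200)   # correlated with the class
noise = rng.normal(size=200)                   # unrelated to the class
print(feature_score(informative, y) > feature_score(noise, y))  # typically True
```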