Multiobjective feature selection for microarray data via distributed parallel algorithms

Bibliographic Details
Published in: Future Generation Computer Systems, Vol. 100, pp. 952–981
Main Authors: Cao, Bin, Zhao, Jianwei, Yang, Po, Yang, Peng, Liu, Xin, Qi, Jun, Simpson, Andrew, Elhoseny, Mohamed, Mehmood, Irfan, Muhammad, Khan
Format: Journal Article
Language: English
Published: Elsevier B.V., 01.11.2019
ISSN: 0167-739X, 1872-7115
Description
Summary: Many real-world problems are large in scale and hence difficult to address. Due to the large number of features in microarray datasets, feature selection and classification are even more challenging for such datasets. Not all of these numerous features contribute to the classification task, and some even impede performance. Through feature selection, a feature subset that contains only a small number of essential features can be generated to increase the classification accuracy and significantly reduce the time consumption. In this paper, we construct a multiobjective feature selection model that simultaneously considers the classification error, the feature number and the feature redundancy. For this model, we propose several distributed parallel algorithms based on different encodings and an adaptive strategy. Additionally, to reduce the time consumption, various tactics are employed, including a feature number constraint, distributed parallelism and sample-wise parallelism. For a batch of microarray datasets, the proposed algorithms are superior to several state-of-the-art multiobjective evolutionary algorithms in terms of both effectiveness and efficiency.
Highlights:
• A multiobjective feature selection model is presented and tackled.
• Algorithms with two encoding methodologies are proposed.
• An adaptive technique is explored.
• An explicit feature number threshold and distributed parallelism are employed for efficiency.
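The abstract names a three-objective model (classification error, number of selected features, and feature redundancy) evaluated over candidate feature subsets. The sketch below only illustrates how such a subset could be scored; the binary-mask encoding, the k-NN classifier, the 5-fold cross-validation, and the mean absolute pairwise correlation used as the redundancy measure are assumptions for illustration, not the formulations used in the paper.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def evaluate_subset(mask, X, y):
    """Score one candidate feature subset on three objectives (all minimized):
    classification error, feature count, and feature redundancy.
    The concrete choices here (k-NN, 5-fold CV, mean absolute pairwise
    correlation) are illustrative assumptions, not the paper's exact method."""
    idx = np.flatnonzero(mask)
    if idx.size == 0:
        # An empty subset cannot classify anything: assign worst-case error.
        return 1.0, 0, 0.0

    X_sub = X[:, idx]

    # Objective 1: classification error estimated by cross-validated k-NN.
    accuracy = cross_val_score(KNeighborsClassifier(n_neighbors=5),
                               X_sub, y, cv=5).mean()
    error = 1.0 - accuracy

    # Objective 2: number of selected features.
    n_selected = idx.size

    # Objective 3: redundancy as the mean absolute pairwise Pearson
    # correlation among the selected features.
    if n_selected > 1:
        corr = np.abs(np.corrcoef(X_sub, rowvar=False))
        redundancy = (corr.sum() - n_selected) / (n_selected * (n_selected - 1))
    else:
        redundancy = 0.0

    return error, n_selected, redundancy
```

In a binary-encoded evolutionary search, each individual's mask would be scored this way and the resulting objective vectors passed to Pareto-based selection; the feature number constraint mentioned in the abstract could then be enforced by repairing or penalizing masks that select more than a fixed number of features.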
DOI: 10.1016/j.future.2019.02.030