A-SFS: Semi-supervised feature selection based on multi-task self-supervision
| Published in: | Knowledge-based systems, Volume 252, p. 109449 |
|---|---|
| Main authors: | Qiu, Zhifeng; Zeng, Wanxin; Liao, Dahua; Gui, Ning |
| Format: | Journal Article |
| Language: | English |
| Published: | Elsevier B.V., 27.09.2022 |
| Subjects: | Attention mechanism; Self-supervised; Feature selection; Autoencoder |
| ISSN: | 0950-7051, 1872-7409 |
| Online access: | Get full text |
| Abstract | Feature selection is an important process in machine learning. It builds an interpretable and robust model by selecting the features that contribute most to the prediction target. However, most mature feature selection algorithms, both supervised and semi-supervised, fail to fully exploit the complex latent structure among features. We believe that this structure is very important for the feature selection process, especially when labels are scarce and the data are noisy.
To this end, we introduce a deep learning-based self-supervised mechanism into the feature selection problem, namely batch-Attention-based Self-supervision Feature Selection (A-SFS). First, a multi-task self-supervised autoencoder is designed to uncover the hidden structure among features with the support of two pretext tasks. Guided by the integrated information from the multi-task self-supervised learning model, a batch-attention mechanism generates feature weights according to batch-based feature selection patterns, alleviating the impact of a handful of noisy samples. The method is compared with 14 strong benchmarks, including LightGBM and XGBoost. Experimental results show that A-SFS achieves the highest accuracy on most datasets. Furthermore, the design significantly reduces the reliance on labels: only 1/10 of the labeled data is needed to achieve the same performance as the state-of-the-art baselines. Results also show that A-SFS is the most robust to noisy and missing data.
•A new feature selection method based on self-supervised pattern discovery. •A multi-task self-supervised model for latent structure discovery. •Batch-attention-based feature weight generation. |
|---|---|
| ArticleNumber | 109449 |
| Author | Zeng, Wanxin; Gui, Ning; Liao, Dahua; Qiu, Zhifeng |
| Author_xml | 1. Qiu, Zhifeng (School of Automation, Central South University, Changsha, 410083, Hunan, China); 2. Zeng, Wanxin (School of Automation, Central South University, Changsha, 410083, Hunan, China); 3. Liao, Dahua (School of Automation, Central South University, Changsha, 410083, Hunan, China); 4. Gui, Ning, ORCID 0000-0003-4983-5327, ninggui@csu.edu.cn (School of Computer Science and Engineering, Central South University, Changsha, 410083, Hunan, China) |
| ContentType | Journal Article |
| Copyright | 2022 Elsevier B.V. |
| DOI | 10.1016/j.knosys.2022.109449 |
| Discipline | Computer Science |
| EISSN | 1872-7409 |
| ISICitedReferencesCount | 10 |
| ISSN | 0950-7051 |
| IsPeerReviewed | true |
| IsScholarly | true |
| Keywords | Attention mechanism Self-supervised Feature selection Autoencoder |
| Language | English |
| ORCID | 0000-0003-4983-5327 |
| PublicationCentury | 2000 |
| PublicationDate | 2022-09-27 |
| PublicationDateYYYYMMDD | 2022-09-27 |
| PublicationDecade | 2020 |
| PublicationTitle | Knowledge-based systems |
| PublicationYear | 2022 |
| Publisher | Elsevier B.V |
| StartPage | 109449 |
| SubjectTerms | Attention mechanism Autoencoder Feature selection Self-supervised |
| Title | A-SFS: Semi-supervised feature selection based on multi-task self-supervision |
| URI | https://dx.doi.org/10.1016/j.knosys.2022.109449 |
| Volume | 252 |
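The abstract above describes a multi-task self-supervised autoencoder that learns latent structure among features through two pretext tasks. The abstract does not name those tasks, so the sketch below assumes VIME-style tabular pretext tasks (reconstructing randomly corrupted features and predicting which entries were corrupted); all class and function names are illustrative and are not taken from the authors' implementation.

```python
# Minimal sketch, not the published A-SFS model. Assumes two pretext tasks:
# (1) reconstruct the uncorrupted input, (2) predict the corruption mask.
import torch
import torch.nn as nn

class MultiTaskSelfSupervisedAE(nn.Module):
    """Shared encoder over tabular features with two pretext-task heads."""
    def __init__(self, n_features: int, hidden_dim: int = 64):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(n_features, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim), nn.ReLU(),
        )
        # Pretext task 1: reconstruct the original (uncorrupted) features.
        self.reconstruction_head = nn.Linear(hidden_dim, n_features)
        # Pretext task 2: predict which entries were corrupted (binary mask).
        self.mask_head = nn.Linear(hidden_dim, n_features)

    def corrupt(self, x: torch.Tensor, p: float = 0.3):
        """Replace a random fraction p of entries with values taken from other
        rows in the batch; return the corrupted input and the corruption mask."""
        mask = (torch.rand_like(x) < p).float()
        shuffled = x[torch.randperm(x.size(0))]
        x_tilde = (1 - mask) * x + mask * shuffled
        return x_tilde, mask

    def forward(self, x: torch.Tensor):
        x_tilde, mask = self.corrupt(x)
        z = self.encoder(x_tilde)
        x_hat = self.reconstruction_head(z)
        mask_logits = self.mask_head(z)
        # Sum of the two pretext losses; unlabeled data only, no targets needed.
        recon_loss = nn.functional.mse_loss(x_hat, x)
        mask_loss = nn.functional.binary_cross_entropy_with_logits(mask_logits, mask)
        return recon_loss + mask_loss, z
```

Training this objective on unlabeled rows is what lets the abstract's claim about reduced label reliance make sense: the encoder learns feature interactions before any labels are consumed.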
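The second component the abstract describes is a batch-attention mechanism that derives feature weights from batch-level patterns so that a handful of noisy samples cannot dominate the selection. The sketch below shows one plausible reading of that idea, with per-sample attention scores averaged over the batch; the attention network, its input (raw features rather than the self-supervised representation), and the mean aggregation are assumptions, not the published design.

```python
# Minimal sketch of batch-level attention weighting, under the assumptions
# stated above; names are illustrative only.
import torch
import torch.nn as nn

class BatchAttentionWeights(nn.Module):
    """Turns per-sample feature scores into a single batch-level weight vector."""
    def __init__(self, n_features: int, hidden_dim: int = 64):
        super().__init__()
        self.score_net = nn.Sequential(
            nn.Linear(n_features, hidden_dim), nn.Tanh(),
            nn.Linear(hidden_dim, n_features),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Per-sample attention over features (softmax along the feature axis).
        per_sample = torch.softmax(self.score_net(x), dim=1)  # (batch, n_features)
        # Averaging over the batch smooths out scores driven by a few noisy rows.
        return per_sample.mean(dim=0)                         # (n_features,)

# Example usage: rank features by their batch-level weights.
x_batch = torch.randn(128, 20)           # 128 samples, 20 candidate features
weights = BatchAttentionWeights(20)(x_batch)
top_k = torch.topk(weights, k=5).indices  # indices of the 5 highest-weighted features
```

The design choice illustrated here is the one the abstract emphasizes: because the weights are computed from batch-wide patterns rather than individual samples, a few noisy or incomplete rows have limited influence on which features are kept.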