Liver disease classification using histogram-based gradient boosting classification tree with feature selection algorithm


Bibliographic Details
Published in:Biomedical signal processing and control Vol. 100; p. 107102
Main Author: Theerthagiri, Prasannavenkatesan
Format: Journal Article
Language:English
Published: Elsevier Ltd 01.02.2025
Subjects:
ISSN:1746-8094
Description
Summary:
•This work proposes the Histogram-based Gradient Boosting Classification Tree (HGBoost) for predicting liver diseases.
•Combined with a recursive feature selection algorithm, the proposed HGBoost improved accuracy by 1–7% and reduced MSE.
•The results of the proposed algorithm will help physicians make better decisions for liver disease patients.
Healthcare is essential to daily life, and diagnostic techniques should be easily accessible. In particular, early identification of liver disease supports physicians in making treatment decisions. Using feature selection and classification approaches, this work predicts liver disorders through machine learning. The paper proposes the Histogram-based Gradient Boosting Classification Tree with a recursive feature selection algorithm (HGBoost): recursive feature selection and gradient boosting are combined to forecast liver disease. The proposed HGBoost method has been evaluated on Indian liver patient records. Several classification techniques, including MLP, Gboost, AdaBoost, and the proposed HGBoost, were implemented and compared on accuracy, confusion matrix, and area under the curve. Aided by the recursive feature selection technique, the proposed HGBoost surpassed the other algorithms: compared with MLP, RF, Gboost, and AdaBoost, accuracy improved by 4–9% and MSE decreased by 1–7%.
DOI:10.1016/j.bspc.2024.107102