Liver disease classification using histogram-based gradient boosting classification tree with feature selection algorithm

Detailed bibliography
Published in: Biomedical Signal Processing and Control, Volume 100, p. 107102
Main author: Theerthagiri, Prasannavenkatesan
Format: Journal Article
Language: English
Published: Elsevier Ltd, 01.02.2025
ISSN: 1746-8094
Description
Summary:
• This work proposes the Histogram-based Gradient Boosting Classification Tree for predicting liver diseases.
• The proposed HGBoost, combined with a recursive feature selection algorithm, improves accuracy by 1–7% and reduces the MSE.
• The results of the proposed algorithm can help physicians make better decisions for liver disease patients.
Healthcare is essential to daily life, and diagnostic techniques should be easily accessible. In particular, early identification of liver disease supports physicians in their decision making. Using feature selection and classification approaches, this work aims to predict liver disorders through machine learning. The Histogram-based Gradient Boosting Classification Tree with a recursive feature selection algorithm (HGBoost) is proposed: recursive feature selection and gradient boosting are combined to forecast liver disease. The proposed HGBoost method is evaluated on Indian liver patient records. Several classification techniques, including MLP, Gboost, Adaboost, and the proposed HGBoost, are implemented and compared in terms of accuracy, confusion matrix, and area under the curve. With the recursive feature selection technique, the proposed HGBoost surpasses the other algorithms: compared with MLP, RF, Gboost, and Adaboost, its accuracy is 4–9% higher and its MSE is 1–7% lower.
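As a rough illustration of the pipeline the abstract describes, the following Python sketch combines recursive feature elimination with scikit-learn's HistGradientBoostingClassifier on the public Indian Liver Patient Dataset. The file name, column names, the number of selected features, and the use of a plain GradientBoostingClassifier to rank features inside RFE (HistGradientBoostingClassifier exposes no feature_importances_) are assumptions for illustration, not the authors' exact implementation.

```python
# Hypothetical sketch of a recursive-feature-selection + histogram-based
# gradient boosting pipeline, in the spirit of the HGBoost approach described
# in the abstract. CSV path and column names are assumed from the public
# Indian Liver Patient Dataset (ILPD).
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier, HistGradientBoostingClassifier
from sklearn.feature_selection import RFE
from sklearn.metrics import accuracy_score, mean_squared_error, roc_auc_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("indian_liver_patient.csv").dropna()
df["Gender"] = (df["Gender"] == "Male").astype(int)   # encode the only categorical column
X = df.drop(columns=["Dataset"])                      # "Dataset" holds the 1/2 class label
y = (df["Dataset"] == 1).astype(int)                  # 1 = liver disease, 0 = no disease

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

# Recursive feature elimination; a plain GradientBoostingClassifier ranks the
# features here because HistGradientBoostingClassifier has no feature_importances_.
selector = RFE(GradientBoostingClassifier(random_state=42), n_features_to_select=6)
selector.fit(X_train, y_train)
X_train_sel = selector.transform(X_train)
X_test_sel = selector.transform(X_test)

# Final histogram-based gradient boosting model trained on the selected features.
clf = HistGradientBoostingClassifier(random_state=42)
clf.fit(X_train_sel, y_train)

pred = clf.predict(X_test_sel)
proba = clf.predict_proba(X_test_sel)[:, 1]
print("Accuracy:", accuracy_score(y_test, pred))
print("MSE     :", mean_squared_error(y_test, pred))
print("AUC     :", roc_auc_score(y_test, proba))
```

The same split and metrics (accuracy, MSE, area under the curve) can be reused to compare this model against MLP, random forest, Gboost, and Adaboost baselines as in the abstract.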
DOI: 10.1016/j.bspc.2024.107102