Optimization of decision trees using modified African buffalo algorithm

Detailed bibliography
Published in: Journal of King Saud University - Computer and Information Sciences, Volume 34, Issue 8, pp. 4763-4772
Main authors: Archana R. Panhalkar, Dharmpal D. Doye
Format: Journal Article
Language: English
Publication details: Springer, 01.09.2022
ISSN: 1319-1578
Description
Summary: Decision tree induction is a simple yet powerful learning and classification tool for discovering knowledge in databases. The volume of data in databases is growing to quite large sizes, both in the number of attributes and in the number of instances. For such extensive data, important limitations of decision trees are instability, locally optimal decisions, and overfitting. The simple, effective, and non-converging nature of the African Buffalo Optimization (ABO) algorithm makes it suitable for solving complex optimization problems. In this paper, we propose the African Buffalo Optimized Decision Tree (ABODT) algorithm, which creates globally optimized decision trees using the intelligent and collective behaviour of African buffalos. A modified African Buffalo Optimization algorithm is used to create efficient and optimal decision trees. To evaluate the efficiency of the proposed ABODT algorithm, experiments are performed on 15 standard datasets of various sizes and domains from the UCI machine learning repository. Results show that the ABODT algorithm globally optimizes decision trees, increases accuracy, and reduces the size of the decision tree. The optimized trees are more stable and efficient than conventional decision trees.
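The African Buffalo Optimization that the abstract builds on can be illustrated with a short sketch. The update rules below follow the commonly cited ABO formulation, in which each buffalo blends a move toward the herd's best-known position (`bg`) and its own best-found position (`bp`); the function name, parameter values, and the continuous test problem are illustrative assumptions and are not taken from this paper, which applies a modified ABO to decision tree construction:

```python
import random

def abo_minimize(fitness, dim, n_buffalo=10, iters=300,
                 lp1=0.6, lp2=0.4, lam=1.0, bounds=(-5.0, 5.0)):
    """Minimal African Buffalo Optimization sketch (illustrative parameters)."""
    lo, hi = bounds
    # w: buffalo positions; m: exploration moves (accumulated "maaa"/"waaa" signals)
    w = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_buffalo)]
    m = [[0.0] * dim for _ in range(n_buffalo)]
    bp = [row[:] for row in w]            # each buffalo's personal best position
    bg = min(w, key=fitness)[:]           # herd-wide best position
    for _ in range(iters):
        for k in range(n_buffalo):
            for d in range(dim):
                # Democratic update: pull toward the herd best and the personal best.
                m[k][d] = (m[k][d]
                           + lp1 * random.random() * (bg[d] - w[k][d])
                           + lp2 * random.random() * (bp[k][d] - w[k][d]))
                # Move the buffalo and clamp it to the search bounds.
                w[k][d] = max(lo, min(hi, (w[k][d] + m[k][d]) / lam))
            if fitness(w[k]) < fitness(bp[k]):
                bp[k] = w[k][:]
                if fitness(bp[k]) < fitness(bg):
                    bg = bp[k][:]
    return bg
```

In the paper's setting, the "position" a buffalo carries would encode a candidate decision tree (or its split choices) and `fitness` would score the induced tree, rather than a continuous vector as in this sketch.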
DOI:10.1016/j.jksuci.2021.01.011