Optimal decision trees for categorical data via integer programming


Detailed bibliography
Published in: Journal of Global Optimization, Volume 81, Issue 1, pp. 233–260
Main authors: Günlük, Oktay; Kalagnanam, Jayant; Li, Minhan; Menickelly, Matt; Scheinberg, Katya
Format: Journal Article
Language: English
Published: New York: Springer US, 01.09.2021
Publisher: Springer; Springer Nature B.V.
ISSN: 0925-5001, 1573-2916
Description
Summary: Decision trees have been a very popular class of predictive models for decades due to their interpretability and good performance on categorical features. However, they are not always robust and tend to overfit the data. Additionally, if allowed to grow large, they lose interpretability. In this paper, we present a mixed integer programming formulation to construct optimal decision trees of a prespecified size. We take the special structure of categorical features into account and allow combinatorial decisions (based on subsets of values of features) at each node. Our approach can also handle numerical features via thresholding. We show that very good accuracy can be achieved with small trees using moderately sized training sets. The optimization problems we solve are tractable with modern solvers.
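
The abstract's core idea, a mixed integer program whose binary variables choose a subset of category values at a split, can be illustrated with a deliberately tiny example. The sketch below is a rough illustration only, not the authors' formulation from the paper: it selects a single categorical split that minimizes training misclassifications, and the toy data, the variable names z and e, and the use of the PuLP package with its bundled CBC solver are all assumptions made for this example.

# Illustrative sketch only: a toy MIP for one categorical split, not the
# paper's formulation. Assumes PuLP with the bundled CBC solver; the data
# and variable names are made up for this example.
from pulp import LpProblem, LpVariable, LpMinimize, LpBinary, lpSum, PULP_CBC_CMD

# Toy training set: (value of one categorical feature, binary label).
samples = [("red", 1), ("red", 1), ("blue", 0), ("green", 0), ("blue", 1)]
values = sorted({v for v, _ in samples})

prob = LpProblem("single_categorical_split", LpMinimize)

# z[v] = 1 if category value v is routed to the left branch of the split.
z = {v: LpVariable(f"z_{v}", cat=LpBinary) for v in values}
# e[i] = 1 if sample i is misclassified (left branch predicts 1, right predicts 0).
e = [LpVariable(f"e_{i}", cat=LpBinary) for i in range(len(samples))]

# Objective: minimize the number of misclassified training samples.
prob += lpSum(e)

for i, (v, y) in enumerate(samples):
    if y == 1:
        prob += e[i] >= 1 - z[v]   # a positive sample sent right is an error
    else:
        prob += e[i] >= z[v]       # a negative sample sent left is an error

prob.solve(PULP_CBC_CMD(msg=False))
print("values routed left (predicted 1):", [v for v in values if z[v].value() > 0.5])

The paper's model goes well beyond this toy: it optimizes an entire tree of prespecified size, places such subset decisions at every internal node, and handles numerical features via thresholding.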
Funding: USDOE; National Science Foundation (NSF). Grant numbers: AC02-06CH11357; CCF-1320137
DOI: 10.1007/s10898-021-01009-y