Learning customized and optimized lists of rules with mathematical programming

We introduce a mathematical programming approach to building rule lists, which are a type of interpretable, nonlinear, and logical machine learning classifier involving IF-THEN rules. Unlike traditional decision tree algorithms like CART and C5.0, this method does not use greedy splitting and pruning...

Full description

Saved in:
Bibliographic Details
Published in: Mathematical programming computation Vol. 10; no. 4; pp. 659-702
Main Authors: Rudin, Cynthia, Ertekin, Şeyda
Format: Journal Article
Language:English
Published: Berlin/Heidelberg: Springer Berlin Heidelberg, 01.12.2018
Springer Nature B.V
Subjects:
ISSN: 1867-2949, EISSN: 1867-2957
Online Access:Get full text
Abstract We introduce a mathematical programming approach to building rule lists, which are a type of interpretable, nonlinear, and logical machine learning classifier involving IF-THEN rules. Unlike traditional decision tree algorithms like CART and C5.0, this method does not use greedy splitting and pruning. Instead, it aims to fully optimize a combination of accuracy and sparsity, obeying user-defined constraints. This method is useful for producing non-black-box predictive models, and has the benefit of a clear user-defined tradeoff between training accuracy and sparsity. The flexible framework of mathematical programming allows users to create customized models with a provable guarantee of optimality. The software reviewed as part of this submission was given the DOI (Digital Object Identifier) https://doi.org/10.5281/zenodo.1344142 .
Author Rudin, Cynthia
Ertekin, Şeyda
Author_xml – sequence: 1
  givenname: Cynthia
  surname: Rudin
  fullname: Rudin, Cynthia
  email: cynthia@cs.duke.edu
  organization: Departments of Computer Science, Electrical and Computer Engineering, and Statistical Science, Duke University
– sequence: 2
  givenname: Şeyda
  surname: Ertekin
  fullname: Ertekin, Şeyda
  organization: Department of Computer Engineering, Middle East Technical University; MIT Sloan School of Management, Massachusetts Institute of Technology
CitedBy_id crossref_primary_10_3847_1538_4357_ad2261
crossref_primary_10_1080_09593330_2023_2192877
crossref_primary_10_3390_math11224594
crossref_primary_10_1007_s10601_023_09348_1
crossref_primary_10_1016_j_ejor_2021_08_017
crossref_primary_10_1111_insr_12536
crossref_primary_10_1016_j_jclepro_2020_122181
crossref_primary_10_1214_22_AOS2171
crossref_primary_10_1007_s13369_021_06153_x
crossref_primary_10_3390_sym13122439
crossref_primary_10_1007_s44007_021_00003_w
crossref_primary_10_1007_s11750_021_00594_1
crossref_primary_10_1287_ijds_2021_0043
crossref_primary_10_1109_TVCG_2020_3045560
crossref_primary_10_1287_ijds_2021_0001
ContentType Journal Article
Copyright Springer-Verlag GmbH Germany, part of Springer Nature and The Mathematical Programming Society 2018
Copyright Springer Science & Business Media 2018
DOI 10.1007/s12532-018-0143-8
DatabaseName CrossRef
ProQuest Computer Science Collection
DatabaseTitle CrossRef
ProQuest Computer Science Collection
DatabaseTitleList
ProQuest Computer Science Collection
DeliveryMethod fulltext_linktorsrc
Discipline Engineering
Mathematics
EISSN 1867-2957
EndPage 702
ExternalDocumentID 10_1007_s12532_018_0143_8
ISICitedReferencesCount 32
ISSN 1867-2949
IsDoiOpenAccess false
IsOpenAccess true
IsPeerReviewed true
IsScholarly true
Issue 4
Keywords Decision lists
Decision trees
Interpretable modeling
Mixed-integer programming
Sparsity
Associative classification
Artificial intelligence
MSC codes: 68T05 Learning and adaptive systems; 90C11 Mixed integer programming; 62-04 Explicit machine computation and programs (not the theory of computation or programming)
Language English
LinkModel DirectLink
Notes ObjectType-Article-1
SourceType-Scholarly Journals-1
ObjectType-Feature-2
OpenAccessLink https://hdl.handle.net/11511/35666
PQID 2118844260
PQPubID 2044128
PageCount 44
ParticipantIDs proquest_journals_2118844260
crossref_citationtrail_10_1007_s12532_018_0143_8
crossref_primary_10_1007_s12532_018_0143_8
springer_journals_10_1007_s12532_018_0143_8
PublicationCentury 2000
PublicationDate 2018-12-01
PublicationDateYYYYMMDD 2018-12-01
PublicationDecade 2010
PublicationPlace Berlin/Heidelberg
PublicationSubtitle A Publication of the Mathematical Optimization Society
PublicationTitle Mathematical programming computation
PublicationTitleAbbrev Math. Prog. Comp
PublicationYear 2018
Publisher Springer Berlin Heidelberg
Springer Nature B.V
SourceID proquest
crossref
springer
SourceType Aggregation Database
Enrichment Source
Index Database
Publisher
StartPage 659
SubjectTerms Accuracy
Decision trees
Digital Object Identifier
Full Length Paper
Machine learning
Mathematical analysis
Mathematical models
Mathematical programming
Mathematics
Mathematics and Statistics
Mathematics of Computing
Operations Research/Decision Theory
Optimization
Pruning
Sparsity
Theory of Computation
Title Learning customized and optimized lists of rules with mathematical programming
URI https://link.springer.com/article/10.1007/s12532-018-0143-8
https://www.proquest.com/docview/2118844260
Volume 10