Local Minimax Learning of Functions With Best Finite Sample Estimation Error Bounds: Applications to Ridge and Lasso Regression, Boosting, Tree Learning, Kernel Machines, and Inverse Problems
Saved in:
| Published in: | IEEE Transactions on Information Theory, Vol. 55, No. 12, pp. 5700-5727 |
|---|---|
| Main Author: | Jones, L.K. |
| Format: | Journal Article |
| Language: | English |
| Published: | New York, NY: IEEE, 2009-12-01 |
| Subjects: | minimax; ridge regression; reproducing kernel; kernel method; inverse problem |
| ISSN: | 0018-9448 (print), 1557-9654 (electronic) |
| Online Access: | Full text |
| Abstract | Optimal local estimation is formulated in the minimax sense for inverse problems and nonlinear regression. This theory provides best mean squared finite sample error bounds for some popular statistical learning algorithms and also for several optimal improvements of other existing learning algorithms such as smoothing splines and kernel regularization. The bounds and improved algorithms are not based on asymptotics or Bayesian assumptions and are truly local for each query, not depending on cross-validating estimates at other queries to optimize modeling parameters. Results are given for optimal local learning of approximately linear functions with side information (context) using real algebraic geometry. In particular, finite sample error bounds are given for ridge regression and for a local version of lasso regression. The new regression methods require only quadratic programming with linear or quadratic inequality constraints for implementation. Greedy additive expansions are then combined with local minimax learning via a change in metric. An optimal strategy is presented for fusing the local minimax estimators of a class of experts, providing optimal finite sample prediction error bounds from (random) forests. Local minimax learning is extended to kernel machines. Best local prediction error bounds for finite samples are given for Tikhonov regularization. The geometry of reproducing kernel Hilbert space is used to derive improved estimators with finite sample mean squared error (MSE) bounds for class membership probability in two-class pattern classification problems. A purely local, cross-validation-free algorithm is proposed which uses Fisher information with these bounds to determine best local kernel shape in vector machine learning. Finally, a locally quadratic solution to the finite Fourier moments problem is presented. After reading the first three sections the reader may proceed directly to any of the subsequent applications sections. |
|---|---|
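The abstract's claim that per-query regularization can be tuned from a finite sample bound rather than cross validation is easy to illustrate in the classical ridge setting. The Python sketch below is illustrative only, not the paper's construction: it assumes the standard model y = Xw + eps with i.i.d. N(0, sigma^2) noise, the side information ||w|| <= C, and a known sigma; the function name `ridge_worstcase_mse` and the grid search over lambda are invented for this example.

```python
import numpy as np

def ridge_worstcase_mse(X, x0, lam, sigma, C):
    """Worst-case finite-sample MSE of the ridge prediction x0 @ w_hat,
    taken over all true coefficient vectors w with ||w|| <= C, under
    y = X w + eps with i.i.d. N(0, sigma^2) noise.

    With A = (X'X + lam*I)^{-1}, the prediction bias at x0 is
    -lam * x0' A w, so the worst-case |bias| is lam * C * ||A x0||,
    and the sampling variance is sigma^2 * ||X A x0||^2.
    """
    d = X.shape[1]
    A = np.linalg.inv(X.T @ X + lam * np.eye(d))
    Ax0 = A @ x0
    worst_bias = lam * C * np.linalg.norm(Ax0)
    variance = sigma**2 * np.linalg.norm(X @ Ax0)**2
    return worst_bias**2 + variance

# Purely local tuning: pick lambda for THIS query by minimizing the
# bound itself -- no cross validation, no other queries involved.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 5))      # design matrix (hypothetical data)
x0 = rng.standard_normal(5)           # the query point
sigma, C = 0.5, 2.0                   # assumed noise level and norm bound
lams = np.logspace(-3, 2, 60)
best_lam = min(lams, key=lambda lam: ridge_worstcase_mse(X, x0, lam, sigma, C))
print(f"lambda = {best_lam:.4g}, bound = {ridge_worstcase_mse(X, x0, best_lam, sigma, C):.4g}")
```

Minimizing this worst-case bound in lambda depends only on X, the query x0, and the side information (sigma, C), so each query gets its own regularization level with no held-out data; this is the flavor of the "truly local" tuning the abstract describes, though the paper's local minimax bounds are derived differently and are sharper.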
| Author | Jones, L.K. |
| Author Affiliation | Dept. of Math. Sci., Univ. of Massachusetts, Lowell, MA, USA |
| CODEN | IETTAW |
| ContentType | Journal Article |
| Copyright | 2015 INIST-CNRS; Institute of Electrical and Electronics Engineers, Inc. (IEEE), Dec 2009 |
| DOI | 10.1109/TIT.2009.2027479 |
| Discipline | Engineering; Computer Science; Applied Sciences |
| EISSN | 1557-9654 |
| EndPage | 5727 |
| Genre | Original research; Feature |
| ISICitedReferencesCount | 7 |
| ISSN | 0018-9448 |
| IsPeerReviewed | true |
| IsScholarly | true |
| Issue | 12 |
| Keywords | Query; Optimal estimation; Modeling; Implementation; Spline; Mean square error; Learning; Optimal strategy; Hilbert space; Learning algorithm; minimax; Estimation error; Smoothing methods; Nonlinear regression; ridge regression; Regression analysis; Pattern recognition; Quadratic programming; reproducing kernel; Kernel method; Inverse problem; Pattern classification; Minimax method; Metric; Fusion; Algebraic geometry |
| Language | English |
| License | https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html CC BY 4.0 |
| PageCount | 28 |
| PublicationDate | 2009-12-01 |
| PublicationPlace | New York, NY |
| PublicationTitle | IEEE transactions on information theory |
| PublicationTitleAbbrev | TIT |
| PublicationYear | 2009 |
| Publisher | IEEE (The Institute of Electrical and Electronics Engineers, Inc.) |
| StartPage | 5700 |
| SubjectTerms | Algorithms; Applied sciences; Boosting; Errors; Estimating techniques; Estimation error; Exact sciences and technology; Fusion; Information geometry; Information theory; Information, signal and communications theory; inverse problem; Inverse problems; Kernel; Learning; Machine learning; Mathematical analysis; Mathematical models; minimax; Minimax technique; Minimax techniques; Parameter optimization; Pattern recognition; Regression analysis; Regression tree analysis; reproducing kernel; ridge regression; Samples; Signal processing; Smoothing methods; Statistical analysis; Statistical learning; Statistical methods; Telecommunications and information theory; Validity |
| Title | Local Minimax Learning of Functions With Best Finite Sample Estimation Error Bounds: Applications to Ridge and Lasso Regression, Boosting, Tree Learning, Kernel Machines, and Inverse Problems |
| URI | https://ieeexplore.ieee.org/document/5319754 https://www.proquest.com/docview/195932121 https://www.proquest.com/docview/869850908 |
| Volume | 55 |